Waji's Standards

Discussions on more advanced topics such as monolithic vs micro-kernels, transactional memory models, and paging vs segmentation should go here. Use this forum to expand and improve the wiki!
Wajideu
Member
Posts: 153
Joined: Wed Jul 30, 2014 1:05 am

Re: Waji's Standards

Post by Wajideu »

b.zaar wrote:So I found this What’s Wrong With GNU make?

And I'm just gonna leave something attached. Feel free to laugh at it then maybe test it on your unix versions. I'm sure it could be improved...
I haven't checked the attachment yet, but I'll laugh at that article if anything. The person doesn't understand the difference between lexical analysis and syntactical/semantical analysis to begin with. Secondly, he doesn't understand the necessity of automatically initialized variables. Offloading extra work like having to type:

Code: Select all

CC = gcc
onto the user when it can be done implicitly is just asking to piss people off and cause confusion when the value of CC is dependent upon the environment. For example, an alternative development kit may implicitly set the OBJCOPY variable to something aside from GNU objcopy to allow conversion between their own proprietary object file format. The GNU development kit on the other hand, doesn't set this variable.
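For what it's worth, GNU make already has a built-in default for CC (plain cc), so a makefile can use it without assigning anything; a quick sketch, assuming GNU make and a made-up hello.c (the recipe line is indented with a tab, as usual):

Code: Select all

# CC is never assigned here; GNU make's built-in default ('cc') is used,
# and an SDK or the user can still override it, e.g.: make CC=clang CFLAGS=-O2
$(info building with CC = '$(CC)')

hello: hello.c
	$(CC) $(CFLAGS) -o $@ $<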

He also rants about there being both := and = operators, which again is really stupid. They exist for things like this:

Code: Select all

# after these lines, VAR_A still holds the unexpanded text "$(VAL)",
# while VAR_B holds nothing (VAL was still empty when := was evaluated)
VAR_A = $(VAL)
VAR_B := $(VAL)
VAL = 4
This is necessary in cases where a variable needs to be modified several times, or when the makefile includes other sub-makefiles that perform that work, which is very common (e.g. see devkitPro and the official Nintendo development kits).
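For example, here's a rough sketch of the sub-makefile case (core.mk, extras.mk and the variable names are made up; GNU make assumed):

Code: Select all

# deferred expansion lets the included makefiles keep adding to the pieces
OBJS = $(CORE_OBJS) $(EXTRA_OBJS)

include core.mk        # sets CORE_OBJS
include extras.mk      # sets or extends EXTRA_OBJS

# OBJS is only expanded here, so it sees everything set above;
# with := on the first line it would have been empty
prog: $(OBJS)
	$(CC) -o $@ $^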

Next, he complains about patsubst not handling ambiguous naming. That's the developer's fault; they shouldn't have given the files ambiguous names in the first place. If they want to be anal about doing that, they can waste their own time trying to find a workaround.

Next, he complains about there being no data types other than strings. It's not a programming language, it's a recipe book; you don't need numbers. But if you absolutely must have them, Unix has a utility called 'expr' that evaluates the expression passed to it and prints the result.
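Something like this, assuming GNU make and a POSIX expr on the PATH:

Code: Select all

# arithmetic by shelling out to expr, which prints the result on stdout
COUNT := $(shell expr 2 + 3)
$(info COUNT is $(COUNT))
# prints: COUNT is 5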

Next, he complains about dependencies. This is mostly why autotools exists. Then he complains about time-stamps being unreliable because other programs can modify them, which is a 'duh'; that's exactly how you force make to rebuild a file or ignore it (touch / touch -d <time>). If you want to force them to be rebuilt, declare them .PHONY, touch them, or do 'make clean all'.
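A tiny illustration of those options (just a sketch; GNU make assumed, file names made up):

Code: Select all

.PHONY: all clean
all: prog

prog: main.o
	$(CC) -o $@ $^

clean:
	rm -f prog main.o

# ways to force a rebuild, as described above:
#   touch main.c       - bump a prerequisite's timestamp
#   make clean all     - or throw everything away and start over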

A lot of that article pissed me off. Posing what a programmer might do wrong as a problem with the utility itself is like making the obnoxious argument:
"smell(jar < fart 45%+ time);" doesn't work, therefore C++ is broken! ZOMG, Stroustrup is f*$@n retarded!



The only thing I would agree with him on is that the if statement could be better. Aside from that, most of the true problems with using make would be solved if there was a configuration process prior to calling make.
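In practice that split tends to look something like this; config.mk and the variable names here are made up for illustration (GNU make assumed):

Code: Select all

# the makefile just consumes whatever a separate configure step wrote out,
# e.g. config.mk might contain: CC=clang, ENABLE_DEBUG=yes, TARGET=x86_64-elf
-include config.mk

ifeq ($(ENABLE_DEBUG),yes)
CFLAGS += -g -O0
endif

prog: main.o
	$(CC) $(CFLAGS) -o $@ $^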
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Waji's Standards

Post by Brendan »

Hi,

When I first started programming...

You'd turn the computer on and get a prompt in about 1 second. You could type a line of code and it'd do it immediately. If what you typed began with a line number it'd insert it into the source code. You could type "run" whenever you liked and it'd start running your program. If anything went wrong in your program, or you pressed "break", you'd go back to the prompt instantly, and could start debugging by typing line/s of code without line numbers (like "print foo" if you wanted to know what value the variable "foo" had when the program was stopped).

After 30+ years of progress and improvements; it's amazing how much simpler programming has become.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
b.zaar
Member
Posts: 294
Joined: Wed May 21, 2008 4:33 am
Location: Mars MTC +6:00

Re: Waji's Standards

Post by b.zaar »

Wajideu wrote:I'll laugh at that article if anything. The person doesn't understand the difference between lexical analysis and syntactical/semantical analysis to begin with. Secondly, he doesn't understand the necessity of automatically initialized variables.
If I recall correctly you didn't understand how make executes a rule by using the shell or that an ifeq ... $(error ...) macro executes immediately so atm I'm going to value someone else's opinion on how make does actually work.
Wajideu wrote:For example, an alternative development kit may implicitly set the OBJCOPY variable to something aside from GNU objcopy to allow conversion between their own proprietary object file format. The GNU development kit on the other hand, doesn't set this variable.
I've been told that an alternative to the GNU tools and standard work practices is a waste of time.
Wajideu wrote:He also rants about there being both := and = operators, which again is really stupid. They exist for things like this:

Code: Select all

# after these lines, VAR_A still holds the unexpanded text "$(VAL)",
# while VAR_B holds nothing (VAL was still empty when := was evaluated)
VAR_A = $(VAL)
VAR_B := $(VAL)
VAL = 4
This is necessary in cases where a variable needs to be modified several times, or when the makefile includes other sub-makefiles that perform that work, which is very common (e.g. see devkitPro and the official Nintendo development kits).
What is the current value of $(VAR_A) at the exact time you use it?
Wajideu wrote:Next, he complains about dependencies. This is mostly why autotools exists.
This is back to "I write code in BASIC, which compiles to Pascal, which compiles to C, which compiles to assembly".

My little attachment does some of the work of autotools in make format. Using a bash script (which autotools also relies on) I'm sure I could improve it.
Wajideu wrote:Then he complains about time-stamps being unreliable because other programs can modify them, which is a 'duh'; that's exactly how you force make to rebuild a file or ignore it (touch / touch -d <time>). If you want to force them to be rebuilt, declare them .PHONY, touch them, or do 'make clean all'.
I suggest you look up how nmake (the AT&T/Lucent version, I linked to it earlier) keeps track of changes. His idea was that he didn't want things rebuilt just because a file was touched. It works like git: if the file's contents haven't changed, it's treated as the same file even though it may have a later timestamp, and there's no need to update the git repository when nothing has changed.
Wajideu wrote:A lot of that article pissed me off. Posing what a programmer might do wrong as a problem with the utility itself
If the utility was right why do we need autotools?
Article wrote: Silence is Golden

According to Eric Raymond, “One of Unix’s oldest and most persistent design rules is that when a program has nothing interesting or surprising to say, it should shut up. Well-behaved Unix programs do their jobs unobtrusively, with a minimum of fuss and bother. Silence is golden.” make does not follow this rule.
Doesn't make break one of the golden rules?
Wajideu wrote:The only thing I would agree with him on is that the if statement could be better. Aside from that, most of the true problems with using make would be solved if there was a configuration process prior to calling make.
Again... just a simple little attachment.
"God! Not Unix" - Richard Stallman

Website: venom Dev
OS project: venom OS
Hexadecimal Editor: hexed
Wajideu
Member
Posts: 153
Joined: Wed Jul 30, 2014 1:05 am

Re: Waji's Standards

Post by Wajideu »

b.zaar wrote: If I recall correctly you didn't understand how make executes a rule by using the shell or that an ifeq ... $(error ...) macro executes immediately so atm I'm going to value someone else's opinion on how make does actually work.
My statement had nothing to do with not understanding how an if statement works. You're making up bullcrap. I stated that I was uncertain whether make would handle unfamiliar syntax after the if statement, because from prior experience it seemed to me as though make did syntactic checking on the entire file before executing the code.
b.zaar wrote:I've been told that an alternative to the GNU tools and standard work practices is a waste of time.
Not sure what this has to do with what I said...
b.zaar wrote:What is the current value of $(VAR_A) at the exact time you use it?
I believe the variable remains unexpanded until it is used; i.e. if you write 'VAR_A = $(VAL)', VAR_A will literally hold '$(VAL)' until it is expanded (for example, when it's assigned somewhere with := or finally passed to a program in a rule). This is just my assumption though; it would be best to check the manual.
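For what it's worth, it's easy to check with a throwaway makefile (GNU make; $(info) prints while the makefile is being read):

Code: Select all

# '=' stores the text "$(VAL)" and expands it only when used;
# ':=' expands immediately, while VAL is still empty
VAR_A = $(VAL)
VAR_B := $(VAL)
VAL = 4

# by the time make reads this line, VAL is 4
$(info VAR_A is "$(VAR_A)", VAR_B is "$(VAR_B)")
# prints: VAR_A is "4", VAR_B is ""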
b.zaar wrote:This is back to "I write code in BASIC, which compiles to Pascal, which compiles to C, which compiles to assembly".
b.zaar wrote:If the utility was right why do we need autotools?
No, it's not. Do you consider going to the store to buy groceries part of actually cooking food? Of course not. Likewise, configuring and building are two completely different things, just like planning a recipe and making it. When you try to mix the two, you get a convoluted mess.
b.zaar wrote:I suggest you look up how nmake (the AT&T/Lucent version, I linked to it earlier) keeps track of changes. His idea was that he didn't want things rebuilt just because a file was touched. It works like git: if the file's contents haven't changed, it's treated as the same file even though it may have a later timestamp, and there's no need to update the git repository when nothing has changed.
I'm not saying there aren't better ways; I'm just saying that the way make does it isn't a problem. Imho, a better technique would be to store a cache file that keeps track of file hashes. But what we're talking about here is a problem with how make is implemented, not with how it's designed.
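A rough sketch of that idea in plain GNU make, with per-file .md5 stamps standing in for the cache (assumes md5sum and cmp are available):

Code: Select all

# rebuild foo.o only when foo.c's contents change, not merely its timestamp
%.o: %.c.md5
	$(CC) -c $(patsubst %.md5,%,$<) -o $@

# the stamp rule runs every time, but only rewrites the stamp when the hash
# actually differs, so 'touch foo.c' alone no longer forces a recompile
%.c.md5: %.c FORCE
	@md5sum $< | cmp -s - $@ || md5sum $< > $@

FORCE:
.PRECIOUS: %.c.md5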
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Waji's Standards

Post by Brendan »

Hi,
Wajideu wrote:I'm not saying there aren't better ways; I'm just saying that the way make does it isn't a problem. Imho, a better technique would be to store a cache file that keeps track of file hashes. But what we're talking about here is a problem with how make is implemented, not with how it's designed.
Fill in the blank: A programmer creates a description of the executable they want (the source code), then _________ to get an executable.

In the ideal case, the programmer would either click a "compile" button or click a "compile with debugging" button. They would not have to learn 2 or more different languages (e.g. shell, make's language, and auto-conf's language); and would not use a "low yield fork bomb" (auto-conf) to generate another fork bomb (make/makefiles) that controls a third fork bomb (many instances of the compiler).

Make has 2 purposes:
  • It tells the compiler and linker (whose job it is to create an executable) how to create an executable; because the compiler and linker are too stupid to do their job without a hideously over-engineered baby-sitter telling them how.
  • It "optimises" the build process (e.g. by avoiding recompiling things that don't need to be recompiled) because the compiler and linker are too stupid to do this themselves.
Auto-tools also has 2 purposes:
  • To handle dependencies; which is something that package managers should do and shouldn't be considered part of the build process to begin with
  • To work around the fact that the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (which is effectively the same as having no standard)
You can not hide a pile of crap by burying it underneath another layer of crap - this only creates a larger pile of crap. If the compiler and linker are bad, fix the compiler and linker. If the build environment has no standards, pick one or invent one and enforce it.

If I call someone a "web developer", then often this is my way of suggesting that person lacks the competence needed to do anything efficiently. There are literally hundreds of languages/tools designed by "web developers" that are far superior to the "auto-conf, make, compile, link" mess. Anyone that can't even see the problem should be ashamed.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Muazzam
Member
Posts: 543
Joined: Mon Jun 16, 2014 5:59 am
Location: Shahpur, Layyah, Pakistan

Re: Waji's Standards

Post by Muazzam »

Wajideu wrote: Well, there you have it. Hit me peeps, my body is ready. [-o<
Simply, we don't need your "Standards".
b.zaar
Member
Posts: 294
Joined: Wed May 21, 2008 4:33 am
Location: Mars MTC +6:00

Re: Waji's Standards

Post by b.zaar »

Wajideu wrote:Do you consider going to the store to buy groceries part of actually cooking food? Of course not.
I do actually consider this part of preparing a whole meal, from reading a new recipe to shopping to chopping to cooking to serving. Not until it's on the plate is it a complete meal.
Brendan wrote:You can not hide a pile of crap by burying it underneath another layer of crap - this only creates a larger pile of crap. If the compiler and linker are bad, fix the compiler and linker. If the build environment has no standards, pick one or invent one and enforce it.
This is the exact reason make is not quite right and nothing built on top can fix it. You can't expect to fix the engine of a car by putting on a better coat of paint.

Btw my first attachment was an early test. I did include some unix checks in the autoconfig.mk file but somehow they didn't make it to the zip I uploaded. Here's the correct version. It should now check for Darwin, FreeBSD and Linux.
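For the curious, the kind of check meant here is just a shell probe from inside make; a generic sketch (GNU make plus uname), not the contents of the attachment:

Code: Select all

# HOST_OS ends up as e.g. Linux, Darwin or FreeBSD; HOST_ARCH as x86_64, i686, ...
HOST_OS := $(shell uname -s)
HOST_ARCH := $(shell uname -m)

ifeq ($(HOST_OS),Darwin)
CFLAGS += -DHOST_IS_DARWIN
endif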
Attachments
autoconfig.zip
(2.34 KiB) Downloaded 87 times
"God! Not Unix" - Richard Stallman

Website: venom Dev
OS project: venom OS
Hexadecimal Editor: hexed
Wajideu
Member
Posts: 153
Joined: Wed Jul 30, 2014 1:05 am

Re: Waji's Standards

Post by Wajideu »

Brendan wrote:Make has 2 purposes:
  • It tells the compiler and linker (whose job it is to create an executable) how to create an executable; because the compiler and linker are too stupid to do their job without a hideously over-engineered baby-sitter telling them how.
  • It "optimises" the build process (e.g. by avoiding recompiling things that don't need to be recompiled) because the compiler and linker are too stupid to do this themselves.
Both of those are only partially correct. Make has 1 job: to use recipes to create targets based on a set of prerequisites. The fact that the rules for building .c/.cpp files are implicitly specified is purely a perk of the way it's implemented. You pointing fingers at the compiler and linker as being the source of the problem somewhat disturbs me, but again, that's something left to the implementation, so debating our personal preferences will get us nowhere.

All I'll say is that these tools were written in an era when MMUs weren't in every PC. Programs had to have a very small memory footprint, so Unix developers broke the tasks that needed to be accomplished down into the smallest possible implementations and relied on the use of pipes. That approach can be considered outdated, but at least it's aged well.

Brendan wrote:Auto-tools also has 2 purposes:
  • To handle dependencies; which is something that package managers should do and shouldn't be considered part of the build process to begin with
  • To work around the fact that the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (which is effectively the same as having no standard)
Nope. Autotools has 8 jobs:
  • Detecting the platform of the host and target systems; including the architecture, machine, and subsystem.
  • Detecting the programs available on the host; (eg. detecting if it should use gcc or lcc as the c compiler) and ensuring that each of these is functioning properly
  • Detecting the headers and libraries available on the system
  • Detecting the functions and type definitions listed within headers and libraries to determine if the ones that the host has installed are capable of being used
  • Providing an interface to configure the project with specific options and features; eg. "--enable-language=c,c++", "--with-headers=", etc.
  • Generating a header containing important package information and specific information detected by the configure process to be used by the pre-processor to allow platform-specific configuration and optimization
  • To provide an easy system of building and distributing packages. (ie. "./configure; make all distcheck" would configure and build the package, then bundle it into a tarball for distribution)
  • To provide a standardized way of managing open source projects. Each project usually has a specific set of files such as ChangeLog, AUTHORS, INSTALL, BUGS, README, and LICENSE files which respectively 1) keep track of changes since the last distributions, so if the current distribution has broken something you can roll back to an older distribution 2) keep track of the contributors to the project 3) Provide installation instructions for the package 4) keep track of bugs that need to be fixed, 5) provide basic information about the project, and 6) provide licensing information essential to developers who may wish to fork or distribute the project
The 8th job is the one that bugs a lot of people, because not everyone is fond of how GNU manages its packages.


Brendan wrote:You can not hide a pile of crap by burying it underneath another layer of crap - this only creates a larger pile of crap. If the compiler and linker are bad, fix the compiler and linker. If the build environment has no standards, pick one or invent one and enforce it.
You're not burying anything. As I stated before, most people who dislike Autotools just don't have a clear understanding of what it does. Simply saying, "it makes makefiles" is a huge understatement.


Muazzam wrote:Simply, we don't need your "Standards".
Maybe not, but that isn't going to stop me from posting my ideas.
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Waji's Standards

Post by Brendan »

Hi,
Wajideu wrote:
Brendan wrote:Make has 2 purposes:
  • It tells the compiler and linker (whose job it is to create an executable) how to create an executable; because the compiler and linker are too stupid to do their job without a hideously over-engineered baby-sitter telling them how.
  • It "optimises" the build process (e.g. by avoiding recompiling things that don't need to be recompiled) because the compiler and linker are too stupid to do this themselves.
Both of those are only partially correct. Make has 1 job: to use recipes to create targets based on a set of prerequisites. The fact that the rules for building .c/.cpp files are implicitly specified is purely a perk of the way it's implemented.
You're saying "Make has 1 job, to use recipes to create targets based on a set of prerequisites (and this job has 2 purposes, telling the compiler/linker how and optimising the build process)".
Wajideu wrote:
Brendan wrote:Auto-tools also has 2 purposes:
  • To handle dependencies; which is something that package managers should do and shouldn't be considered part of the build process to begin with
  • To work around the fact that the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (which is effectively the same as having no standard)
Nope. Autotools has 8 jobs:
  • Detecting the platform of the host and target systems; including the architecture, machine, and subsystem.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where differences between platforms/targets are not abstracted adequately via things like the C/C++ standard library).
Wajideu wrote:
  • Detecting the programs available on the host; (eg. detecting if it should use gcc or lcc as the c compiler) and ensuring that each of these is functioning properly.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't just have a standard environment variable saying the name of the C compiler or even expect the C compiler to function properly).
Wajideu wrote:
  • Detecting the headers and libraries available on the system
  • Detecting the functions and type definitions listed within headers and libraries to determine if the ones that the host has installed are capable of being used
Which are necessary because either:
  • You failed to use something to handle non-standard dependencies (e.g. package manager; which is a new but unrelated "lack of standard package format" cluster-bork all of its own); or
  • The "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't assume the language's "standard library" is standard).
Wajideu wrote:
  • Providing an interface to configure the project with specific options and features; eg. "--enable-language=c,c++", "--with-headers=", etc.
This is mixing 2 things. For configuring the project because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards, see previous comments.

For configuring the project because the project itself is a massive cluster-bork of fail (e.g. compile time options where run-time options were needed or the project failed to define a standard for the final executable's behaviour) you're right, in that (something like) auto-conf is needed to allow incompetent fools to spread the disease further.
Wajideu wrote:
  • Generating a header containing important package information and specific information detected by the configure process to be used by the pre-processor to allow platform-specific configuration and optimization
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you actually need to abuse the pre-processor to work-around the complete and utter failure of the tools to abstract differences between platforms).
Wajideu wrote:
  • To provide an easy system of building and distributing packages. (ie. "./configure; make all distcheck" would configure and build the package, then bundle it into a tarball for distribution)
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where there is no standard way to generate a package).
Wajideu wrote:
  • To provide a standardized way of managing open source projects. Each project usually has a specific set of files such as ChangeLog, AUTHORS, INSTALL, BUGS, README, and LICENSE files which respectively 1) keep track of changes since the last distributions, so if the current distribution has broken something you can roll back to an older distribution 2) keep track of the contributors to the project 3) Provide installation instructions for the package 4) keep track of bugs that need to be fixed, 5) provide basic information about the project, and 6) provide licensing information essential to developers who may wish to fork or distribute the project
You mean there's 6 files that get included in a tarball and are ignored by every other part of the build process? Oh my - we're going to need a team of 12 "over-engineers" working around the clock for the next six years to handle a massively complex requirement like that (but don't worry, I'm sure we can recover the research and development expenses when we get our "method for including a thing in another thing" patent)!


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Wajideu
Member
Posts: 153
Joined: Wed Jul 30, 2014 1:05 am

Re: Waji's Standards

Post by Wajideu »

Brendan wrote:You're saying "Make has 1 job, to use recipes to create targets based on a set of prerequisites (and this job has 2 purposes, telling the compiler/linker how and optimising the build process)".
It has nothing to do with the compiler/linker specifically; and it doesn't optimize the build process, it just rebuilds when the prerequisites have changed.
Brendan wrote:
Wajideu wrote:Nope. Autotools has 8 jobs:
  • Detecting the platform of the host and target systems; including the architecture, machine, and subsystem.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where differences between platforms/targets are not abstracted adequately via things like the C/C++ standard library).
No, it's necessary so the code can be configured specifically for the platform. ie. If your target architecture is "i586", you can use MMX extensions to boost the speed of matrix math. Or if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage.
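Something along these lines, say; TARGET_ARCH and HOST_ARCH stand for whatever the configure step detected, and the defines are made up for illustration:

Code: Select all

ifeq ($(TARGET_ARCH),i586)
# let the MMX matrix-math path be compiled in
CFLAGS += -mmmx -DHAVE_MMX
endif

ifeq ($(HOST_ARCH),$(TARGET_ARCH))
# emulator case: host and target match, so most of the dynamic recompiler can be skipped
CFLAGS += -DEMU_SAME_ARCH
endif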
Brendan wrote:
Wajideu wrote:
  • Detecting the programs available on the host; (eg. detecting if it should use gcc or lcc as the c compiler) and ensuring that each of these is functioning properly.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't just have a standard environment variable saying the name of the C compiler or even expect the C compiler to function properly).
Nope. It's necessary because not everyone uses the same build environment.
Brendan wrote:
Wajideu wrote:
  • Detecting the headers and libraries available on the system
  • Detecting the functions and type definitions listed within headers and libraries to determine if the ones that the host has installed are capable of being used
Which are necessary because either:
  • You failed to use something to handle non-standard dependencies (e.g. package manager; which is a new but unrelated "lack of standard package format" cluster-bork all of its own); or
  • The "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't assume the language's "standard library" is standard).
Nope. The package manager has nothing to do with it; there are often differences between versions of libraries or software that make them incompatible. For example, several programs (including parts of the autotools project itself) have changed so much since their initial versions that, in order to provide backwards compatibility, several different versions of the same program are released with a wrapper around them. Additionally, consider the changes between coding standards: C11 introduces new features like generic selection and atomic types, and Embedded C introduces accumulative and fractional fixed-point types and hardware I/O standardization. However, neither of these is available in C89 or pre-standard C. Additionally, not all target platforms support those features; e.g. GCC supports Embedded C fixed-point types, but if you try to use them to compile an x86 program, it'll fail because the architecture itself doesn't support them natively.
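The kind of probe involved looks roughly like this if you do it by hand from make (a sketch; it assumes GNU make, a POSIX shell, and a $(CC) that accepts -std=c11):

Code: Select all

# try to compile a one-liner that uses _Atomic; define HAVE_C11_ATOMICS only on success
HAS_ATOMICS := $(shell echo '_Atomic int x;' | $(CC) -std=c11 -x c -c - -o /dev/null 2>/dev/null && echo yes)

ifeq ($(HAS_ATOMICS),yes)
CFLAGS += -DHAVE_C11_ATOMICS
endif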
Brendan wrote:
Wajideu wrote:
  • Providing an interface to configure the project with specific options and features; eg. "--enable-language=c,c++", "--with-headers=", etc.
This is mixing 2 things. For configuring the project because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards, see previous comments.

For configuring the project because the project itself is a massive cluster-bork of fail (e.g. compile time options where run-time options were needed or the project failed to define a standard for the final executable's behaviour) you're right, in that (something like) auto-conf is needed to allow incompetent fools to spread the disease further.
No, it's you trying to mix 2 things. This is why it's separate from make. Configuring and making are 2 completely different things.
Brendan wrote:
Wajideu wrote:
  • Generating a header containing important package information and specific information detected by the configure process to be used by the pre-processor to allow platform-specific configuration and optimization
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you actually need to abuse the pre-processor to work-around the complete and utter failure of the tools to abstract differences between platforms).
Again, no. It's necessary in order to provide machine specific optimization and to allow the inclusion of meta-information into the binary. ie.

Code: Select all

printf ("This version of " PACKAGE_NAME " is " PACKAGE_VERSION "\n");
# if TARGET_ARCH == i386
printf ("this is optimized for i386 processors\n");
# else
printf ("this is unoptimized\n");
# endif

~$ ./test
This version of MyPackage is 1.0.0
this is optimized for i386 processors
Brendan wrote:
Wajideu wrote:
  • To provide an easy system of building and distributing packages. (ie. "./configure; make all distcheck" would configure and build the package, then bundle it into a tarball for distribution)
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where there is no standard way to generate a package).
Agreed. But now we're talking about a problem with Unix/Linux, and no one wants to fix it. When someone else brings up the idea of fixing it, they get bashed and talked down to, much like what happened about 3 or so pages ago in this very topic.
Brendan wrote:
Wajideu wrote:
  • To provide a standardized way of managing open source projects. Each project usually has a specific set of files such as ChangeLog, AUTHORS, INSTALL, BUGS, README, and LICENSE files which respectively 1) keep track of changes since the last distributions, so if the current distribution has broken something you can roll back to an older distribution 2) keep track of the contributors to the project 3) Provide installation instructions for the package 4) keep track of bugs that need to be fixed, 5) provide basic information about the project, and 6) provide licensing information essential to developers who may wish to fork or distribute the project
You mean there's 6 files that get included in a tarball and are ignored by every other part of the build process? Oh my - we're going to need a team of 12 "over-engineers" working around the clock for the next six years to handle a massively complex requirement like that (but don't worry, I'm sure we can recover the research and development expenses when we get our "method for including a thing in another thing" patent)!
Because they're not part of the build process; they're information for the developers and users. And as I stated before, not everyone is happy about them being there.


There are many reasons why we need autotools, but also many problems with using it. That's why I want to write a utility that fixes those quirks.
Candy
Member
Posts: 3882
Joined: Tue Oct 17, 2006 11:33 pm
Location: Eindhoven

Re: Waji's Standards

Post by Candy »

I'd just like to thank you for this example:
if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage
as it is the first thing I would not expect to be part of your code or makefile, so to me it is the first thing that legitimizes a configure step.
b.zaar
Member
Posts: 294
Joined: Wed May 21, 2008 4:33 am
Location: Mars MTC +6:00

Re: Waji's Standards

Post by b.zaar »

Candy wrote:I'd just like to thank you for this example:
if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage
as it is the first thing I would not expect to be part of your code or makefile, so to me it is the first thing that legitimizes a configure step.
What makes this impossible to detect using make?
"God! Not Unix" - Richard Stallman

Website: venom Dev
OS project: venom OS
Hexadecimal Editor: hexed
Wajideu
Member
Posts: 153
Joined: Wed Jul 30, 2014 1:05 am

Re: Waji's Standards

Post by Wajideu »

b.zaar wrote:
Candy wrote:I'd just like to thank you for this example:
if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage
as it is the first thing I would not expect to be part of your code or makefile, so to me it is the first thing that legitimizes a configure step.
What makes this impossible to detect using make?
How about providing us with an example of how you can do it in make? Detect both the architecture and subsystem for the host, target (and, optionally, build) systems. Then show how you'd use that information to select which files should be built and how you'd pass that information to your source code.

It's just not feasible or practical in any way. No matter how you look at it, you need to have some other tool to configure this sort of thing.
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Waji's Standards

Post by Brendan »

Hi,
Wajideu wrote:
Brendan wrote:You're saying "Make has 1 job, to use recipes to create targets based on a set of prerequisites (and this job has 2 purposes, telling the compiler/linker how and optimising the build process)".
It has nothing to do with the compiler/linker specifically; and it doesn't optimize the build process, it just rebuilds when the prerequisites have changed.
Skipping unnecessary work is an optimisation. Compare it to a simple script that always rebuilds everything.
Wajideu wrote:
Brendan wrote:
Wajideu wrote:Nope. Autotools has 8 jobs:
  • Detecting the platform of the host and target systems; including the architecture, machine, and subsystem.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where differences between platforms/targets are not abstracted adequately via things like the C/C++ standard library).
No, it's necessary so the code can be configured specifically for the platform. ie. If your target architecture is "i586", you can use MMX extensions to boost the speed of matrix math. Or if you're writing an emulator and you detect that both the host and target architectures are the same, you can skip most of the dynamic recompiling stage.
For someone compiling for themselves (where target architecture is the architecture the compiler is running on) the compiler can auto-detect the target. For compiling for other people (e.g. for a pre-compiled binary only package) you want "all targets" but this isn't very practical so you end up selecting a few common subsets (e.g. 32-bit 80x86 without any features, 32-bit 80x86 with MMX/SSE, and 64-bit 80x86) and most people end up running code that isn't optimised for their system.

In both of these cases, detecting the target in make solves nothing.

Note: A better approach is to use 2 compilers; such that the first compiler does as much sanity checking and optimisation as possible and generates some form of portable intermediate code (where this compiler is used as part of the build process), and the second compiler compiles the portable intermediate code into native code optimised specifically for the end user's specific computer when the package is installed by the end user. This solves all of the problems, including the "people end up running code that isn't optimised for their system" problem, and means that the build process simply doesn't care what the target architecture is. Also note that this is a large part of why something like Java (where there is a JIT compiler optimising for the specific target machine) can be as efficient as C++ (where you can't optimise for the end-user's specific machine) despite the additional overhead of "JIT-ting". Sadly, traditional tools are traditionally bad.
Wajideu wrote:
Brendan wrote:
Wajideu wrote:
  • Detecting the programs available on the host; (eg. detecting if it should use gcc or lcc as the c compiler) and ensuring that each of these is functioning properly.
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't just have a standard environment variable saying the name of the C compiler or even expect the C compiler to function properly).
Nope. It's necessary because not everyone uses the same build environment.
Your words say "Nope", but the meaning behind those words says "Yes, I agree completely, there is no standard build environment and therefore not everyone uses a compatible build environment".
Wajideu wrote:
Brendan wrote:
Wajideu wrote:
  • Detecting the headers and libraries available on the system
  • Detecting the functions and type definitions listed within headers and libraries to determine if the ones that the host has installed are capable of being used
Which are necessary because either:
  • You failed to use something to handle non-standard dependencies (e.g. package manager; which is a new but unrelated "lack of standard package format" cluster-bork all of its own); or
  • The "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you can't assume the language's "standard library" is standard).
Nope. The package manager has nothing to do with it; there are often differences between versions of libraries or software that make them incompatible. For example, several programs (including parts of the autotools project itself) have changed so much since their initial versions that, in order to provide backwards compatibility, several different versions of the same program are released with a wrapper around them. Additionally, consider the changes between coding standards: C11 introduces new features like generic selection and atomic types, and Embedded C introduces accumulative and fractional fixed-point types and hardware I/O standardization. However, neither of these is available in C89 or pre-standard C. Additionally, not all target platforms support those features; e.g. GCC supports Embedded C fixed-point types, but if you try to use them to compile an x86 program, it'll fail because the architecture itself doesn't support them natively.
You're agreeing with me again (saying there's either no standards or multiple standards for libraries, languages, etc).
Wajideu wrote:
Brendan wrote:
Wajideu wrote:
  • Providing an interface to configure the project with specific options and features; eg. "--enable-language=c,c++", "--with-headers=", etc.
This is mixing 2 things. For configuring the project because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards, see previous comments.

For configuring the project because the project itself is a massive cluster-bork of fail (e.g. compile time options where run-time options were needed or the project failed to define a standard for the final executable's behaviour) you're right, in that (something like) auto-conf is needed to allow incompetent fools to spread the disease further.
No, it's you trying to mix 2 things.
Can you provide an example of where an interface to configure the project is necessary, which is not a case of a bad build environment and not a case of a bad project? Please note that it would be in your interest to use some foresight here - e.g. for any potential example, attempt to predict my response; attempt to avoid providing an example where it's easy to find a way to improve the language/tools/build environment instead, and attempt to avoid providing an example where it should have been a run-time option.
Wajideu wrote:
Brendan wrote:
Wajideu wrote:
  • Generating a header containing important package information and specific information detected by the configure process to be used by the pre-processor to allow platform-specific configuration and optimization
Which is necessary because the "build environment" is a massive cluster-bork of fail where virtually everything either has no standard or multiple standards (e.g. where you actually need to abuse the pre-processor to work-around the complete and utter failure of the tools to abstract differences between platforms).
Again, no. It's necessary in order to provide machine specific optimization and to allow the inclusion of meta-information into the binary. ie.

Code: Select all

printf ("This version of " PACKAGE_NAME " is " PACKAGE_VERSION "\n");
# if TARGET_ARCH == i386
printf ("this is optimized for i386 processors\n");
# else
printf ("this is unoptimized\n");
# endif

~$ ./test
This version of MyPackage is 1.0.0
this is optimized for i386 processors
Where do PACKAGE_NAME and PACKAGE_VERSION come from? More specifically, how does either auto-conf or make automatically guess both the package name and package version correctly and prevent the need for programmers to explicitly set them somewhere (e.g. in a header file)?

Are you saying that the C/C++ language specifications have both failed to include a standard "__TARGET_ARCH__" pre-processor macro?
Wajideu wrote:There are many reasons why we need autotools, but also many problems with using it. That's why I want to write a utility that fixes those quirks.
Please repeat this daily: "Brendan is right; the languages, tools and build environment we're using are all a complete cluster-bork of fail; however (for whatever reason) I don't want to fix any of the many problems created by poor languages, poor tools and poor build environments; and I only want a "slightly less bad" work-around that fails to fix any of the many problems."

Note that I am being an idealist, and in practice there are many valid reasons for failing to fix any of the many problems (e.g. lacking the skills, lacking the time, needing it to be compatible with extremely bad OSs like Linux, etc). There is no shame in admitting that you are knowingly failing to fix any of the many problems for a valid reason.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
b.zaar
Member
Posts: 294
Joined: Wed May 21, 2008 4:33 am
Location: Mars MTC +6:00

Re: Waji's Standards

Post by b.zaar »

Wajideu wrote:It's just not feasible or practical in any way. No matter how you look at it, you need to have some other tool to configure this sort of thing.
I think you have a misunderstanding about the tools again. Autotools takes an input file and creates the shell script configure. The configure shell script then detects and creates the makefile.

The tool make works with the shell and could also run the same checks configure does. I'm not going to rewrite autotools in make or write make++ in shell script for the example but have a look at the attachment.

To use it run

Code: Select all

./make++ init
which creates a new project with the default Makefile and configure.mk, just like git init would initialize a git repository.

Now you can run the 3 commands

Code: Select all

./make++
./make++ config
./make++ clean
This useless little example has been created by a lowly Windows user, too simple to use autotools...
Attachments
make++.tar.gz
(15.03 KiB) Downloaded 96 times
"God! Not Unix" - Richard Stallman

Website: venom Dev
OS project: venom OS
Hexadecimal Editor: hexed