somat 17 hours ago
C does as well; that is half the point of make. When building a large C project, make first creates a bunch of object files from the source files, then links them into an executable. It keeps track of which source files have changed and rebuilds only the corresponding object files. The first build is slow; subsequent builds are much faster, often needing to compile only one file and then relink.

At least that's the theory. In reality make has a lot of warts, and writing a good, solid makefile is an art. Don't even get me started on the horrors of automake; perhaps I just need to use it in one of my own projects, but as someone who primarily ports other people's code, I hate it with a passion. It is so much easier when a project just sticks with a hand-crafted makefile.

For completeness: the other half of make is to drive the rest of the build process.
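As a rough sketch of such a hand-crafted makefile (file names are placeholders; -MMD/-MP are the GCC/Clang flags that emit header dependency files; recipe lines must start with a tab):

    CC     := cc
    CFLAGS := -O2 -Wall -MMD -MP

    SRCS := main.c util.c io.c
    OBJS := $(SRCS:.c=.o)

    # Relink only when an object file is newer than the executable.
    app: $(OBJS)
            $(CC) -o $@ $(OBJS)

    # Recompile a .o only when its .c (or a header recorded in the
    # generated .d file) has changed.
    %.o: %.c
            $(CC) $(CFLAGS) -c $< -o $@

    -include $(SRCS:.c=.d)

    clean:
            rm -f app $(OBJS) $(SRCS:.c=.d)

Touch util.c and only util.o is rebuilt before the final link; touch a header and the generated .d files ensure every object that includes it gets rebuilt.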
uecker 15 hours ago
I would say autoconf/automake is not really useful anymore, and probably somebody should just establish a new, simplified, standardized setup and Makefile for C projects. And yes, efficient separate and incremental compilation is a major advantage of C. I do not understand why people criticize this; it works beautifully. I also think it is good that the language and the build system are separate.
badsectoracula 13 hours ago
I think you are probably referring to something other than what I meant in my post above. Borland C++ had the compiler as part of the IDE (there was also a separate command-line version, but the compiler was also built into the IDE itself). This allowed the IDE to avoid spawning a separate process for each file and even to avoid hitting the disk: the compiler, already in RAM as part of the IDE's process, would read the source code straight from the editor's buffer instead of from a file, and would keep a bunch of other state in memory between builds instead of re-reading it. This approach lets the compiler reuse data not only between builds but also between files of the same build.

Meanwhile make is just a program launcher: the program - the compiler - needs to run for each file, loading and parsing everything it needs for every single source file it compiles, rebuilding and destroying its entire universe for each file separately. There is no reuse here - even with precompiled headers to speed things up (which Borland C++ also supported, and which made an already fast system even faster), the compiler still has to build and tear down that universe.

It is not a coincidence that one of the common ways to speed up compilation of large codebases nowadays is unity builds[0], which essentially combine multiple C/C++ files (the files need to be written with this in mind, to avoid one file "polluting" the contents of another) so that multiple compilation units can reuse/share compilation state (such as common header files) within a single compiler instance. It is, for example, a core feature of FASTbuild[1], which combines distributed builds, caching and unity builds.

Of course Borland C++'s approach wasn't perfect, as it also had to work with limited memory, so it still had to hit the disk at some point (note, though, that the Pascal compilers could do everything in memory, including the final linking - even the resulting program could remain in memory). Bugs in the compiler could also linger: I remember sometimes having to restart Borland C++ Builder every few hours because the compiler had gotten confused about something and cached it in memory between builds.

Free Pascal's text mode IDE (shown in the article) likewise has the Free Pascal compiler built into the IDE itself, but in the last release (I think) there is a memory leak, and the IDE's memory use keeps creeping up a little with every build - something that wouldn't matter with a separate program (and most people use FPC as a separate program via Lazarus these days, which is most likely why nobody noticed the leak).
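For illustration, a unity build "file" is often little more than one translation unit that textually includes the others (the file names below are hypothetical):

    /* unity.c - minimal unity build sketch; lexer.c, parser.c and
     * codegen.c are placeholder names. All three are compiled as one
     * translation unit, so shared headers are parsed once and compiler
     * state is reused across them. The included files must cooperate:
     * no clashing static/global names, and no macros left defined that
     * would "pollute" the next file. */
    #include "lexer.c"
    #include "parser.c"
    #include "codegen.c"

The build then invokes the compiler once (e.g. cc -c unity.c) instead of three times, at the cost of losing per-file incremental rebuilds for the files grouped into that unit.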