| |
| ▲ | duped 5 days ago | parent [-] | | Consider that to do this you must:
- Use a build system like make, you can't just `c++ build`
- Understand that C++ compilers by default have no idea where most things are, you have to tell them exactly where to search
- Use an external tool that's not your build system or compiler to actually inform the compiler what those search paths are
- Oh also understand the compiler doesn't actually output what you want, you also need a linker
- That linker also doesn't know where to find things, so you need the external tool to use it
- Oh and you still have to use a package manager to install those dependencies to work with pkg-config, and it will install them globally. If you want to use it in different projects you better hope you're ok with them all sharing the same version.
Now you can see why things like IDEs became default tools for teaching students how to write C and C++, because there's no "open a text editor and then `c++ build file.cpp` to get output" for anything except hello world examples. | | |
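To make those steps concrete, a minimal sketch of that manual flow (libcurl is only an illustrative package here; pkg-config supplies the search paths and linker flags the compiler doesn't know about):

    $ c++ -c main.cpp $(pkg-config --cflags libcurl)   # compile: pkg-config provides the -I search paths (libcurl is just an example)
    $ c++ main.o $(pkg-config --libs libcurl) -o app   # link: pkg-config provides the -L/-l flags for the linker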
| ▲ | jlarocco 5 days ago | parent | next [-] | | It's really not that big of a deal once you know how it works, and there are tools like CMake and IDEs that will take care of it. On Windows and OSX it's even easier - if you're okay writing only for those platforms. It's more difficult to learn, and it seems convoluted for people coming from Python and Javascript, but there are a lot of advantages to not having package management and build tooling tightly integrated with the language or compiler, too. | | |
| ▲ | saghm 4 days ago | parent | next [-] | | "It's not really that big a deal once you know how it works" is the case with pretty much everything, though. The question is whether the amount of time needed to learn how something works is worthwhile, and the sheer number of things you need to invest time in learning in a language like C++ compared to more modern languages is a big deal. Looking at a single one of them in isolation, like the build system, essentially just zooms in on one problem far enough to remove the other ones from the picture. | |
| ▲ | scrubs 4 days ago | parent | prev | next [-] | | I agree -- I've been at it long enough -- cmake etc. makes stuff pretty darn easy. But in industrial settings where multiple groups share and change libs, something like debpkg may be used. Add caching and you can go quite deep quickly, especially after bolting on CI/CD. One must cop to the fact that a `go build` or `zig build` is just fundamentally better. | | |
| ▲ | rapidlua 4 days ago | parent | next [-] | | Go build is fundamentally better? How so? Go build is so light on features that adding generated files to source control is a norm in go land. | | |
| ▲ | scrubs 4 days ago | parent [-] | | Generated files are noise. Newer languages' build systems have built-in version resolution to resolve dependencies, together with smarter ways to reference dependencies without #include. And that's better. |
| |
| ▲ | jlarocco 4 days ago | parent | prev [-] | | Yeah, I definitely agree the newer tools are better, but sometimes the arguments against C++ get blown out of proportion. It definitely has a lot of flaws, but in practice most of them have solutions or workarounds, and on a day-to-day basis most C++ programmers aren't struggling with this stuff. |
| |
| ▲ | duped 4 days ago | parent | prev | next [-] | | So I come from the C/C++ world, that's part of why I disagree with these takes. I wouldn't say any process involving CMake is "not that big of a deal" because I routinely see veteran developers struggle to edit cmake files to get their code to compile and link. | |
| ▲ | einpoklum 4 days ago | parent | prev | next [-] | | It is a big deal even after you know how it works. The thing is, languages like Rust only make this easier within their controlled "garden". But for C and C++, you build in the "world outside the garden" to begin with, where you are not guaranteed that everyone has prepared everything for you. So it's harder, and you may need third-party tools, or to put in some elbow grease, or both. The upside is that when rustaceans or go-phers and such wander outside their respective gardens, most of them are completely lost and have no idea what to do; but C and C++ people are kinda-sorta at home there already. | | |
| ▲ | steveklabnik 4 days ago | parent [-] | | What is "outside the garden" for Rust? | | |
| ▲ | einpoklum 4 days ago | parent [-] | | Oh, say, use some binary C library with some header, that you found on some system. | | |
| ▲ | hiimkeks 4 days ago | parent [-] | | That shouldn't be too tricky, assuming the binary is built for the sort of device you want to run on. At least not much more complicated than calling any other C code, using bindgen. | | |
| ▲ | steveklabnik 3 days ago | parent [-] | | Yep, Rust made design decisions to make this case as zero overhead as it can be. |
|
|
|
| |
| ▲ | Defletter 5 days ago | parent | prev | next [-] | | This is pure Stockholm syndrome. If I were forced to choose between creating a cross-platform C++ project from scratch or taking an honest to god arrow to the knee, the arrow would be less painful. | | |
| ▲ | AlienRobot 4 days ago | parent | next [-] | | I don't want any arrows in my knees, but I agree. The main reason I don't want to use C/C++ is the header files. You have to write everything in a header file and then again in an implementation file. Every time you want to change a function you need to do this at least twice. And you don't even get fast compilation compared to some languages, because your headers will #include some immense library, every header that includes that header picks up the transitive header dependencies, and to solve this you use precompiled headers, which you might have to set up manually depending on what IDE you are using. It's all too painful. | | |
| ▲ | jstimpfle 4 days ago | parent [-] | | It gets better with experience. You can have a minimal base layer of common but rarely changing functionality. You can cut down on static inline functions in headers. You can keep data structure definitions out of headers, putting only forward declarations there. (Don't use C++ methods, at least don't put them in an API, because they force you to expose your implementation details needlessly.) You can separate data structures from functions in different header files. Grouping functions together with types is often a bad idea, since most useful functionality combines data from two or more "unrelated" types -- so you'd rather group function headers "by topic" than put them alongside types. I just created a subsystem for a performance-intensive application -- a caching layer for millions or even billions of objects. The implementation encompasses over 1000 LOC, but the header only includes <stdint.h>. There are about 5 forward struct declarations and maybe a dozen function declarations in that API. To a degree it might be Stockholm syndrome, but I feel like having had to work around a lot of C's shortcomings, I actually learned quite a lot that helps me in architecting bigger systems now. Turns out a lot of the flexibility and ease that you get from more modern languages mostly allows you to code more sloppily, but being sloppy only works for smaller systems. |
| |
| ▲ | einpoklum 4 days ago | parent | prev | next [-] | | If you were forced to choose between doing that cross-platform project in C++ or in one of the trendy languages - but, of course, it must also work on tiny hobbyist hardware with weird custom OSes, and on 30-year-old machines in some large organization's server farm - then you would choose the C++ project, since you will be able to make that happen, with some pain. With the other languages you'll probably just give up, or need to re-develop all userspace for a bunch of platforms so that it can accommodate the trendy language's build tool. And even that might not be enough. Also: if you are on platforms which support, say, CMake - then the multi-platform C++ project is not even that painful. | | |
| ▲ | nxobject 4 days ago | parent [-] | | > but, of course, it must also work on tiny hobbyist hardware with weird custom OSes, and on 30-year-old machines in some large organization's server farm - then you would choose the C++ project, since you will be able to make that happen, with some pain. With the old and proprietary toolchains involved, I would bet dollars to doughnuts that there's a 50% chance of C++11 being the latest supported standard. In that context, modern C++ is the trendy language. |
| |
| ▲ | CyberDildonics 5 days ago | parent | prev [-] | | Why? There are lots of cross platform libraries and most aspects are not platform specific. It's really not a big deal. Use FLTK and you get most of the cross platform stuff for free in a small package. |
| |
| ▲ | imtringued 5 days ago | parent | prev [-] | | I used to write a lot of C++ in 2017. Now in 2025 I have no memory of how to do that anymore. It's bespoke Makefile nonsense with zero hope of standardization. It's definitely something that doesn't grow with experience. Meanwhile, my Gradle setups would have been almost unchanged since that time if it weren't for the stupid backwards-incompatible Gradle releases. | | |
| ▲ | gpderetta 4 days ago | parent | next [-] | | > It's bespoke Makefile nonsense with zero hope of standardization
Technically, Makefile syntax is standardized (by POSIX), unlike most alternatives. /extremely pedantic | |
| ▲ | pjmlp 4 days ago | parent | prev | next [-] | | I would rather deal with Makefiles than Gradle. | | |
| ▲ | saghm 4 days ago | parent [-] | | I think we can afford to strive for more than just "not quite the absolute worst" (for however we decide to measure quality). |
| |
| ▲ | einpoklum 4 days ago | parent | prev [-] | | > I used to write a lot of C++ in 2017... It's bespoke Makefile nonsense
1. Makefiles are for build systems; they are not C++.
2. Even for building C++ - in 2017, there was no need to write bespoke Makefiles, or any Makefiles. You could, and should, have written CMake; and your CMake files would be usable and relevant today.
> Meanwhile, my Gradle setups would have been almost unchanged since that time
... but, typically, with far narrower applicability. | | |
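For reference, a minimal CMakeLists.txt in that spirit, of the kind that would still configure and build unchanged today (zlib is just an illustrative dependency; any find_package-able library works the same way):

    cmake_minimum_required(VERSION 3.10)
    project(myapp CXX)                               # "myapp" is a placeholder project name
    find_package(ZLIB REQUIRED)                      # locate the dependency the standard way
    add_executable(myapp main.cpp)
    target_link_libraries(myapp PRIVATE ZLIB::ZLIB)  # pulls in include paths and linker flags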
| ▲ | feffe 4 days ago | parent [-] | | CMake has become the defacto standard in many ways, but I don't think it's that easy to deal with. There's often some custom support code in a project (just as with make files) that you need to learn the intricacies of, and also external 3pp modules that solve particular integration issues with building software that you also need to learn. For me, base CMake is pretty easy by now, but I'd rather troubleshoot a makefile than some obscure 3pp CMake module that doesn't do what I want. Plain old makefiles are very hackable for better or worse [1]. It's easy to solve problems with make (in bespoke ways), and at the same time this is the big issue, causing lots of custom solutions of varying correctness. [1]: Make is easy the same way C is easy. | | |
| ▲ | einpoklum 4 days ago | parent [-] | | I didn't say "easy to deal with", I said it's not bespoke nonsense, and that you could keep it mostly unchanged today, 8 years later. Plus - the "obscure third party modules" have been getting less obscure and more standard-ish. In 2017 it was already not bad, today it's better. |
|
|
|
| |
| ▲ | account42 5 days ago | parent | prev | next [-] | | > Use a build system like make, you can't just `c++ build`
This is a strength not a weakness because it allows you to choose your build system independently of the language. It also means that you get build systems that can support compiling complex projects using multiple programming languages.
> Understand that C++ compilers by default have no idea where most things are, you have to tell them exactly where to search
This is a strength not a weakness because it allows you to organize your dependencies and their locations on your computer however you want and are not bound by whatever your language designer wants.
> Use an external tool that's not your build system or compiler to actually inform the compiler what those search paths are
This is a strength not a weakness because you are not bound to a particular way of how this should work.
> Oh also understand the compiler doesn't actually output what you want, you also need a linker
This is a strength not a weakness because now you can link together parts written in different programming languages, which allows you to reuse good code instead of reinventing the universe (a tiny sketch follows below).
> That linker also doesn't know where to find things, so you need the external tool to use it
This is a strength not a weakness for the reasons already mentioned above.
> Oh and you still have to use a package manager to install those dependencies to work with pkg-config, and it will install them globally. If you want to use it in different projects you better hope you're ok with them all sharing the same version.
This is a strength not a weakness because you can have fully offline builds, including ways to distribute dependencies to air-gapped systems, and are not reliant on one specific online service to do your job. Also, all of this is a non-issue if you use a half-modern build system.
Conflating the language, compiler, build system and package manager is one of the main reasons why I stay away from "modern" programming languages. You are basically arguing against the Unix philosophy of having different tools that work together, with each tool focusing on one specific task. This allows different tools to evolve independently and for alternatives to exist, rather than a single tool that has to fit everyone. | | |
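A tiny sketch of that mixed-language linking point (assuming the C functions are declared extern "C" where the C++ code uses them; file names are hypothetical):

    $ gcc -c util.c -o util.o    # C object file
    $ g++ -c main.cpp -o main.o  # C++ object file
    $ g++ main.o util.o -o app   # one link step combines objects from both languages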
| ▲ | xyzzy123 2 days ago | parent | next [-] | | You can't build things without being part of a culture and understanding the tools that culture uses. | |
| ▲ | juliangmp 5 days ago | parent | prev [-] | | > This is a strength not a weakness Massive cope, there's no excuse for the lack of decent infrastructure. I mean, the C++ committee for years said explicitly that they don't care about infrastructure and build systems, so it's not really surprising. | | |
| ▲ | gpderetta 4 days ago | parent [-] | | The reality is that for any moderately complex C++ application, actually compiling C++ code is only a small part of what the build system does. | | |
| ▲ | raverbashing 4 days ago | parent [-] | | Well yeah We have autoconf/automake checking if you're on a big endian PDP8 or if your compiler has support for cutting edge features like "bool" |
|
|
| |
| ▲ | palata 5 days ago | parent | prev | next [-] | | I can use pkg-config just fine. Not sure how relevant the complaint "in order to use a tool, you need to learn how to use the tool" really is. Or, from the other side: not sure what I should think about the quality of the work produced by people who don't want to learn relatively basic skills... it does not take two PhDs to understand how to use pkg-config. | | |
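For anyone who hasn't touched it, the everyday surface of pkg-config really is just a couple of queries (zlib is only an example of an installed package; what gets printed depends on the system):

    $ pkg-config --modversion zlib       # prints the installed version of the package
    $ pkg-config --cflags --libs zlib    # prints the -I/-L/-l flags to hand to the compiler and linker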
| ▲ | duped 5 days ago | parent [-] | | I'm just pointing out that one reason devex sucks in C++ is that you need a wide array of tools that are non-portable and require learning and teaching magic incantations at the command line or in build scripts; that doesn't foster what one could call a "good" experience. Frankly, the idea that your compiler driver should not be a basic build system, package manager, and linker is an idea best left in the 80s where it belongs. | | |
| ▲ | menaerus 5 days ago | parent | next [-] | | For most people this is a feature, not a bug as you suggest. It may come across as a PITA, and for many people it will, but as far as I am concerned, having also experienced the pain of package managers in C++, this is the right way. In the end it's always about the trade-offs. And all the (large) codebases that used conan, bazel or vcpkg induced an order of magnitude more issues to handle than you would otherwise have with plain CMake. Package managers are for convenience, but not all projects can afford the trouble this convenience brings with it. | | |
| ▲ | soanvig 4 days ago | parent | next [-] | | Coming from a different background (TypeScript), I agree, in the sense that there is a line where apparent convenience becomes trouble. The JS ecosystem is known for its hype around build tools. Long term, all of them become a problem by trying to be ever more convenient, leading to more and more abstractions and hidden behaviors, which turns into a mess that is impossible to debug or work around when the user diverges from the author's happy path. Thus I promote using only the necessities and gluing them together yourself. Even if something doesn't work, at least it can be tracked down and solved. | |
| ▲ | palata 4 days ago | parent | prev [-] | | > For most people this is a feature not a bug as you suggest. Exactly: it makes many things nicer to use than the language package managers, e.g. when maintaining a Linux distribution. But people generally don't know how one maintains a Linux distribution, so they can't really see the use-case, I guess. |
| |
| ▲ | palata 4 days ago | parent | prev | next [-] | | > require learning and teaching magic incantations at the command line That's exactly my point: if you think that calling `cmake --build build` is "magic", then maybe you don't have the right profile to use C++ in the first place, because you will have to learn some harder concepts there (like... pointers). To be honest, I find it hard to understand how a software developer can write code and still consider that command line instructions are "magic incantations". To me it's like saying that calling a function like `println("Some text, {}, {}", some_parameter, some_other_parameter)` is a "magic incantation". Calling a function with parameters counts as "the basics" to me. | |
| ▲ | bluGill 5 days ago | parent | prev | next [-] | | That idea -- that packages and builds belong to the language -- only fits simple problems. Large projects need things like more than one language, and so end up fighting the language. | | |
| ▲ | duped 5 days ago | parent [-] | | Every modern language seems to have an answer to this problem that C and C++ refuse to touch because it's out of scope for their respective committees and standards orgs | | |
| ▲ | nothrabannosir 5 days ago | parent | next [-] | | On the front page right now:
Shai-Hulud malware attack: Tinycolor and over 40 NPM packages compromised (stepsecurity.io)
935 points by jamesberthoty 16 hours ago | 730 comments
Maybe obstreperous dependency management ends up being the winning play in 2025 :) | | |
| ▲ | asa400 5 days ago | parent | next [-] | | Just think of how many _more_ vulns C and C++ could be responsible for if they had modern package managers! :) | |
| ▲ | Defletter 5 days ago | parent | prev | next [-] | | Seems like a false dichotomy | |
| ▲ | duped 4 days ago | parent | prev [-] | | Completely unrelated. |
| |
| ▲ | pclmulqdq 5 days ago | parent | prev | next [-] | | C++ has a plethora of available build and package management systems. They just aren't bundled with the compiler. IMO that is a good thing, because it keeps the compiler writers honest. | | |
| ▲ | jimbob45 5 days ago | parent [-] | | You say that as if Cargo, MSBuild, and pip aren’t massively loved by their communities. | | |
| ▲ | jcelerier 5 days ago | parent | next [-] | | Coming from C++, pip and Python dependency management are the bane of my life. How do you make a piece of Python software leveraging PyTorch that will ship as a single .exe and be able to target whatever GPU the user has, without downloads? | | |
| ▲ | jononor 4 days ago | parent | next [-] | | Funnily enough, a lot of the challenges in this particular case are related to PyTorch and CUDA being native code (mostly in C++), combined with the fact that pip is not really adequate as a native/C++ code package manager. Perhaps if C++ had a decent standardized package manager, the Python package system could reuse that? ;p | |
| ▲ | gpderetta 4 days ago | parent | prev [-] | | just wait for next week and python will get a better package manager! |
| |
| ▲ | pclmulqdq 5 days ago | parent | prev | next [-] | | "Massively loved" and "good decision" are orthogonal axes. See the current npm drama. People love wantonly importing dependencies the way they love drinking. Both feel great but neither is good for you. | | |
| ▲ | asa400 5 days ago | parent | next [-] | | Not that npm-style package management is the best we can do or anything, but I would be more sympathetic to this argument if C or C++ had a clearly better security story than JS, Python, etc. (pick your poison), but they're also disasters in this area. What happens in practice is people end up writing their own insecure code instead of using someone else's insecure code. Of course, we can debate the tradeoffs of one or the other! | | |
| ▲ | bluGill 4 days ago | parent [-] | | This isn't only about security. This is about interoperability: in the real world we mix (and should mix!) C, C++, Rust, Python.... In the real world, lawyers audit every dependency to ensure they can legally use it. In the real world, we are responsible for our dependencies and so need to audit the code. |
| |
| ▲ | Defletter 5 days ago | parent | prev [-] | | I'm getting the impression that C/C++ cultists love it whenever there's an npm exploit, because then they can gleefully point at it and pretend that any first-party package manager for C/C++ would inevitably result in the same, never mind the other languages that do not have this issue, or have it to a far, far lesser extent. Do these cultists just not use dependencies? Are they just [probably inexpertly] reinventing every wheel? Or do they use system packages, like that's any better *cough* AUR exploits *cough*. While dependency hell on Node.js (and even Rust, if we're honest) is certainly a concern, it's npm's permissiveness and lack of auditing that's the real problem. That's why Debian is so praised. | |
| ▲ | pclmulqdq 4 days ago | parent | next [-] | | What makes me a C++ "cultist"? I like the language, but I don't think it's a cult. And yes, they do implement their own wheel all the time (usually expertly) because libraries are reserved for functions that really need it: writing left pad is really easy. They also use third-party libraries all the time, too. They just generally pay attention to the source of that library. Google and Facebook also publish a lot of C++ libraries under one umbrella (abseil and folly respectively), and people often use one of them. | |
| ▲ | bluGill 4 days ago | parent | prev | next [-] | | STOP SAYING CULTIST! The word has very strong meaning and does not apply to anyone working with C or C++. I take offense at being called a cultist just because I say C++ is not nearly as bad as the haters keep claiming it is - as well I should. | |
| ▲ | palata 4 days ago | parent | prev [-] | | > Or do they use system packages like that's any better cough AUR exploits cought. AUR stands for "Arch User Repository". It's not the official system repository. > I'm getting the impression that C/C++ cultists love it whenever there's an npm exploit I am not a C/C++ cultist at all, and I actually don't like C++ (the language) so much (I've worked with it for years). I, for one, do not love it when there is an exploit in a language package manager. My problem with language package managers is that people love them precisely because they don't want to learn how to deal with dependencies. Which is actually the problem: if I pull a random Rust library, it will itself pull many transitive dependencies. I recently compared two implementations of the same standard (C++ vs Rust): in C++ it had 8 dependencies (I can audit that myself). In Rust... it had 260 of them. 260! I won't even read through all those names. "It's too hard to add a dependency in C++" is, in my opinion, missing the point. In C++, you have to actually deal with the dependency. You know it exists, you have seen it at least once in your life. The fact that you can't easily pull 260 dependencies you have never heard about is a feature, not a bug. I would be totally fine with great tooling like cargo, if it looked like the problem of random third-party dependencies was under control. But it is not. Not remotely. > Do these cultists just not use dependencies? I choose my dependencies carefully. If I need a couple functions from an open source dependency I don't know, I can often just pull those two functions and maintain them myself (instead of pulling the dependency and its 10 dependencies). > Are they just [probably inexpertly] reinventing every wheel? I find it ironic that when I explain that my problem is that I want to be able to audit (and maintain, if necessary) my dependencies, the answer that comes suggests that I am incompetent and "inexpertly" doing my job. Would it make me more of an expert if I was pulling, running and distributing random code from the Internet without having the smallest clue about who wrote it? Do I need to complain about how hard CMake is and compare a command line to a "magic incantation" to be considered an expert? | | |
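For Rust projects, that transitive fan-out is at least easy to inspect before committing to it; a rough way to look at it (cargo tree ships with recent cargo; the wc count is only a crude approximation of the number of crates pulled in):

    $ cargo tree                        # prints the full transitive dependency graph
    $ cargo tree --prefix none | wc -l  # rough line count of that graph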
| ▲ | Defletter 4 days ago | parent [-] | | > AUR stands for "Arch User Repository". It's not the official system repository. Okay... and? The point being made was that the issue of package managers remains: do you really think users are auditing all those "lib<slam-head-on-keyboard>" dependencies that they're forced to install? Whether they install those dependencies from the official repository or from homebrew, or nix, or AUR, or whatever, is immaterial, the developer washed their hands of this, instead leaving it to the user who in all likelihood knows significantly less than the developers to be able to make an informed decision, so they YOLO it. Third-party repositories would not exist if they had no utility. But this is why Debian is so revered: they understand this dynamic and so maintain repositories that can be trusted. Whereas the solution C/C++ cultists seem to implicitly prefer is having no repositories because dependencies are, at best, a slippery slope. > "It's too hard to add a dependency in C++" It's not hard to add a dependency. I actually prefer the dependencies-as-git-submodules approach to package managers: it's explicit and you know what you're getting and from where. But using those dependencies is a different story altogether. Don't you just love it when one or more of your dependencies has a completely different build system to the others? So now you have to start building dependencies independently, whose artefacts are in different places, etc, etc, this shouldn't be a problem. > I, for one, do not love it when there is an exploit in a language package manager. Oh please, I believe that about as much as ambulance chasers saying they don't love medical emergencies. Otherwise, why are any and all comments begging for a first-party package manager immediately swamped with strawmans about npm as if anyone is actually asking for that, instead of, say, what Zig or Go has? It's because of the cultism, and every npm exploit further entrenches it. | | |
| ▲ | pclmulqdq 4 days ago | parent | next [-] | | C++ usage has nothing to do with static/dynamic linking. One is a language and the other is a way of using libraries. Dynamic linking gives you small binaries with a lot of cross-compatibility, and static linking gives you big binaries with known function. Most production C++ out there follows the same pattern as Rust and Go and uses static linking (where do you think Rust and Go got that pattern from?). Python is a weird language that has tons of dynamic linking while also having a big package manager, which is why pip is hell to use and PyTorch is infamously hard to install. Dynamic linking shifts responsibility for the linked libraries over to the user and their OS, and if it's an Arch user using AUR they are likely very interested in assuming that risk for themselves. 99.9% of Linux users are using Debian or Ubuntu with apt for all these libs, and those maintainers do pay a lot of attention to libraries. | |
| ▲ | palata 4 days ago | parent | prev [-] | | > But this is why Debian is so revered: they understand this dynamic and so maintain repositories that can be trusted. So you do understand my point about AUR. AUR is like adding a third-party repo to your Debian configuration. So it's not a good example if you want to talk about official repositories. Debian is a good example (it's not the only distribution that has that concept), which proves my point and not yours: this is better than unchecked repositories in terms of security. > Whereas the solution C/C++ cultists seem to implicitly prefer is having no repositories because dependencies are, at best, a slippery slope. Nobody says that ever. Either you make up your cult just to win an argument, or you don't understand what C/C++ people say. The whole goddamn point is to have a trusted system repository, and if you need to pull something that is not there, then you do it properly. Which is better than pulling random stuff from random repositories, again. > I actually prefer the dependencies-as-git-submodules approach Oh right. So you do it wrong, it's good to know and it will answer your next complaint: > Don't you just love it when one or more of your dependencies has a completely different build system to the others I don't give a damn because I handle dependencies properly (not as git submodules). I don't have a single project where the dependencies all use the same build system. It's just not a problem at all, because I do it properly. What do I do then? Well exactly the same as what your system package manager does. > this shouldn't be a problem. I agree with you. Call it a footgun if you wish, you are the one pulling the trigger. It isn't a problem for me. > why are any and all comments begging for a first-party package manager immediately swamped with strawmans about npm Where did I do that? > It's because of the cultism, and every npm exploit further entrenches it. It's because npm is a good example of what happens when it goes out of control. Pip has the same problem, and Rust as well. But npm seems to be the worse, I guess because it's used by more people? | | |
| ▲ | Defletter 4 days ago | parent [-] | | Your defensiveness is completely hindering you and I cannot be bothered with that so here are some much needed clarifications: > I am not a C/C++ cultist at all, and I actually don't like C++ (the language) so much (I've worked with it for years). I, for one, do not love it when there is an exploit in a language package manager. If you do neither of those things then did it ever occur to you that this might not be about YOU? > I find it ironic that when I explain that my problem is that I want to be able to audit (and maintain, if necessary) my dependencies, the answer that comes suggests that I am incompetent and "inexpertly" doing my job. Yeah, hi, no you didn't explain that. You're probably mistaking me for someone else in some other conversation you had. The only comment of yours prior to mine in the thread is you saying "I can use pkg-config just fine." And again, you're thinking that I'm calling YOU incompetent, or even that I'm calling you incompetent. But okay, I'm sure your code never has bugs, never has memory issues, is never poorly designed or untested, that you can whip out an OpenGL alternative whatever in no time and it be just as stable and battle-tested, and to say otherwise must be calling you incompetent. That makes total sense. > AUR stands for "Arch User Repository". It's not the official system repository. > So it's not a good example if you want to talk about official repositories. I said system package, not official repository. I don't know why you keep insisting on countering an argument I did not make. Yes, system packages can be installed from unofficial repositories. I don't know how I could've made this clearer. -- Overall, getting bored of this, though the part where you harp on about doing dependencies properly compared to me and not elaborating one bit is very funny. Have a nice day. | | |
| ▲ | palata 4 days ago | parent [-] | | > Your defensiveness Start by not calling everybody disagreeing with you a cultist, next time. > I said system package, not official repository. I don't know why you keep insisting on countering an argument I did not make. Yes, system packages can be installed from unofficial repositories. I don't know how I could've made this clearer. It's not that it is unclear, it's just that it doesn't make sense. When we compare npm to a system package manager in this context, the thing we compare is whether or not is it curated. Agreed, I was maybe not using the right words (I should have said curated package managers vs not curated package managers), but it did not occur to me that it was unclear because comparing npm to a system package manager makes no sense otherwise. It's all just installing binaries somewhere on disk. AUR is much like npm in that it is not curated. So if you find that it is a security problem: great! We agree! If you want to pull something from AUR, you should read its PKGBUILD first. And if it pulls tens of packages from AUR, you should think twice before you actually install it. Just like if someone tells you to do `curl https://some_website.com/some_script.sh | sudo sh`, no matter how convenient that is. Most Linux distributions have a curated repository, which is the default for the "system package manager". Obviously, if users add custom, not curated repositories, it's a security problem. AUR is a bad example because it isn't different from npm in that regard. > though the part where you harp on about doing dependencies properly compared to me and not elaborating one bit is very funny Well I did elaborate at least one bit, but I doubt you are interested in more details than what I wrote: "What do I do then? Well exactly the same as what your system package manager does." I install the dependencies somewhere (just like the system package manager does), and I let my build system find them. It could be with CMake's `find_package`, it could be with pkg-config, whatever knows how to find packages. There is no need to install the dependencies in the place where the system package manager installs stuff: it can go anywhere you want. And you just tell CMake or pkg-config or Meson or whatever you use to look there, too. Using git submodules is just a bad idea for many reasons, including the fact that you need all of them to use the same build system (which you mentioned), or that a clean build usually implies rebuilding the dependencies (for nothing) or that it doesn't work with package managers (system or not). And usually, projects that use git submodule only support that, without offering a way to use the system package(s). | | |
| ▲ | Defletter 4 days ago | parent [-] | | > Start by not calling everybody disagreeing with you a cultist, next time. You'd do very well as a culture war pundit. Clearly I wasn't describing a particular kind of person, no, I'm clearly I'm just talking about everyone I disagree with /s | | |
| ▲ | palata 3 days ago | parent [-] | | So, not interested at all in how to deal with dependencies without git submodules, I reckon? We can stop here indeed. | | |
| ▲ | Defletter 3 days ago | parent [-] | | You misunderstand, I am already well aware. My comment about your lack of elaboration was not due to any ignorance on my part, but rather to point out how you assumed that and refused to elaborate anyway. The idea that I may have my reasons for preferring dependencies-as-git-submodules or their equivalents (like Zig's package system) never crossed your mind. Can't say I'm surprised. Oh well. | | |
| ▲ | palata 3 days ago | parent [-] | | > The idea that I may have my reasons for preferring dependencies-as-git-submodules Well, git submodules are strictly inferior and you know it: you even complained about the fact that it is a pain when some dependencies use different build systems. You choose a solution that does not work, and then you blame the tools. | | |
| ▲ | Defletter 3 days ago | parent [-] | | Okay, I'll bite: your proposed alternative to being able to specify exact versions of dependencies regardless of operating system or distro that I can statically include into a single binary, everything is project-local, guaranteed, is... what? Is it just "Don't"? | | |
| ▲ | palata 3 days ago | parent [-] | | I'm not sure what you mean. What I am saying is that using a dependency is formalised for build systems. Be it npm, cargo, gradle, meson, cmake, you name it. In cargo, you add a line to a toml file that says "please fetch this dependency, install it somewhere you understand, and then use if from this somewhere". What is convenient here is that you as a user don't need to know about those steps (how to fetch, how to install, etc). You can use Rust without Cargo and do everything manually if you need to, it's just that cargo comes with the "package manager" part included. In C/C++, the build systems don't come with the package manager included. It does not mean that there are no package managers. On the contrary, there are tons of them, and the user can choose the one they want to use. Be it the system package manager, a third-party package manager like conan or vcpkg, or doing it manually with a shell/python script. And I do mean the user, not the developer. And because the user may choose the package manager they want, the developer must not interfere otherwise it becomes a pain. Nesting dependencies into your project with git submodules is a way to interfere. As a user, I absolutely hate those projects that actually made extra work to make it hard for me to handle dependencies the way I need. How do we do that with CMake? By using find_package and/or pkg-config. In your CMakeLists.txt, you should just say `find_package(OpenSSL REQUIRED)` (or whatever it is) and let CMake find it the standard way. If `find_package` doesn't work, you can write a find module (that e.g. uses pkg-config). A valid shortcut IMO is to use pkg-config directly in CMakeLists for very small projects, but find modules are cleaner and actually reusable. CMake will search in a bunch of locations on your system. So if you want to use the system OpenSSL, you're done here, it just works. If you want to use a library that is not on the system, you still do `find_package(YourLibrary)`, but by default it won't find it (since it's not on the system). In that case, as a user, you configure the CMake project with `CMAKE_PREFIX_PATH`, saying "before you look on the system, please look into these paths I give you". So `cmake -DCMAKE_PREFIX_PATH=/path/where/you/installed/dependencies -Bbuild -S.`. And this will not only just work, but it means that your users can choose the package manager they want (again: system, third-party like conan/vcpkg, or manual)! It also means that your users can choose to use LibreSSL or BoringSSL instead of OpenSSL, because your CMakeLists does not hardcode any of that! Your CMakeLists just says "I depend on those libraries, and I need to find them in the paths that I use for the search". Whatever you do that makes CMake behave like a package manager (and I include CMake features like the FetchContent stuff) is IMO a mistake, because it won't work with dependencies that don't use CMake, and it will screw (some of) your users eventually. I talk about CMake, but the same applies for other build systems in the C/C++ world. People then tend to say "yeah I am smart, but my users are stupid and won't know how to install dependencies locally and point CMAKE_PREFIX_PATH to them". To which I answer that you can offer instructions to use a third-party package manager like conan or vcpkg, or even write helper scripts that fetch, build and install the dependencies. 
Just do not do that inside the CMakeLists, because it will most certainly make it painful for your users who know what they are doing. Is it simpler than what cargo or npm do? No, definitely not. Is it more flexible, totally. But it is the way it is, and it fucking works. And whoever calls themselves a C/C++ developer and cannot understand how to use the system package manager, or a conan/vcpkg and set CMAKE_PREFIX_PATH need to learn it. I won't say it's incompetence, but it's like being a C++ developer and not understanding how to use a template. It's part of the tools you must learn to use. People will spend half a day debugging a stupid mistake in their code, but somehow can't apprehend that dealing with a dependency is also part of the job. In C/C++, it's what I explained above. With npm, properly dealing with dependencies means checking the transitive dependencies and being aware of what is being pulled. The only difference is that C/C++ makes it hard to ignore it and lose control over your dependencies, whereas npm calls it a feature and people love it for that. I don't deny that CMake is not perfect, the syntax is generally weird, and writing find module is annoying. But it is not an excuse to make a mess at every single step of the process. And people who complain about CMake usually write horrible CMakeLists and could benefit from learning how to do it properly. I don't love CMake, I just don't have to complain about it everywhere I can because I can make it work, and it's not that painful. | | |
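A minimal sketch of the pattern described above (the library name "foo" and the target "myapp" are hypothetical; OpenSSL and PkgConfig use CMake's stock find modules):

    # Preferred: let CMake find a packaged dependency the standard way
    find_package(OpenSSL REQUIRED)
    target_link_libraries(myapp PRIVATE OpenSSL::SSL)

    # Fallback for a library that ships no CMake config files: go through pkg-config
    find_package(PkgConfig REQUIRED)
    pkg_check_modules(FOO REQUIRED IMPORTED_TARGET foo)   # "foo" is a hypothetical .pc name
    target_link_libraries(myapp PRIVATE PkgConfig::FOO)

The user then points the build at wherever the dependencies were installed with -DCMAKE_PREFIX_PATH, exactly as described above.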
| ▲ | Defletter 2 days ago | parent [-] | | While I do appreciate you taking the time to write that, I am somewhat at a loss. How does this justify the antipathy towards notions of a first-party build system and package manager? That's how we got into this argument with each other: I was calling out C/C++ cultists who cling to the ugly patchwork of hacky tooling that is C/C++'s so-called build systems and decry any notion of a first-party build system (or even a package manager to boot) as being destined to become just like npm. C/C++ developers clearly want a build system and package manager, hence all this fragmentation, but I can't for the life of me understand why that fragmentation is preferable. For all the concern about supply-chain attacks on npm, why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)? And why is global installation preferable? Have we learnt nothing? There's a reason why Python has venv now; why Maven and Gradle have wrappers; etc. Projects being able to build themselves to a specification without requiring the host machine to reconfigure itself to suit the needs of this one project, is a bonus, not a drawback. Devcontainers should not need to be a thing. If anything, this just reads like Sunk Cost Fallacy: that "it just works" therefore we needn't be too critical, and anyone who is or who calls for change just needs to git gud. It reminds me of the never-ending war over memory safety: use third-party tools if you must but otherwise just git gud. It's this kind of mindset that has people believing that C/C++'s so-called build systems are just adhering to "there should be some artificial friction when using dependencies to discourage over-use of dependencies", instead of being a Jenga tower of random tools with nothing but gravity holding it all together. If it were up to me, C/C++ would get a more fleshed-out version of Zig's build system and package manager, ie, something unified, simple, with no central repository, project-local, exact, and explicit. You want SQLite? Just refer to SQLite git repository at a specific commit and the build system will sort it out for you. Granted, it doesn't have an official build.zig so you'll need to write your own, or trust a premade one... but that would also be true if you installed SQLite through conan of vcpkg. | | |
| ▲ | palata 2 days ago | parent [-] | | > How does this justify the antipathy towards notions of a first-party build system and package manager? I don't feel particularly antipathic towards notions of first-party build system and package manager. I find it indeniably better to have a first-party build system instead of the fragmentation that exists in C/C++. On the other hand, I don't feel like asking a 20-year old project to leave autotools just because I asked for it. Or to force people to install Python because I think Meson is cool. As for the package manager, one issue is security: is it (even partly) curated or not? I could imagine npm offering a curated repo, and a non-curated repo. But there is also a cultural thing there: it is considered normal to have zero control over the dependencies (my this I mean that if the developer has not heard of dependencies they are pulling, then it's not under control). Admittedly it is not a tooling problem, it's a culture problem. Though the tooling allows this culture to be the norm. When I add a C/C++ dependency to my project, I do my shopping: I go check the projects, I check how mature they are, I look into the codebase, I check who has control over it. Sometimes I will depend on the project, sometimes I will choose to fork it in order to have more control. And of course, if I can get it from the curated list offered by my distro, that's even better. > C/C++ developers clearly want a build system and package manager, hence all this fragmentation One thing is legacy: it did not exist before, many tools were created, and now they exist. The fact that the ecosystem had the flexibility to test different things (which surely influenced the modern languages) is great. In a way, having a first-party tool makes it harder to get that. And then there are examples like Swift where is slowly converged towards SwiftPM. But at the time CocoaPods and Carthage were invented, SwiftPM was not a thing. Also devs want a build system and package manager, but they don't necessarily all want the same one :-). I don't use third-party package managers for instance, instead I build my dependencies manually. Which I find gives me more control, also for cross-compiling. Sometimes I have specific requirements, e.g. when building a Linux distribution (think e.g. Yocto or buildroot). And I don't usually want to depend on Python just for the sake of it, and Conan is a Python tool. > why is it preferable that people trust random third-party package managers and their random third-party repackages of libraries (eg: SQLite on conan and vcpkg)? It's not. Trusting a third-party package manager is actually exactly the same as trusting npm. It's more convenient, but less secure. However it's better when you can rely on a curated repository (like what Linux distributions generally provide). Not everything can be curated, but there is a core. Think OpenSSL for instance. > And why is global installation preferable? For those dependencies that can be curated, there is a question of security. If all your programs on your system link the same system OpenSSL, then it's super easy to update this OpenSSL when there is a security issue. And in situations where what you ship is a Linux system, then there is no point in not doing it. So there are situations where it is preferable. If everything is statically link and you have a critical fix for a common library, you need to rebuild everything. > If it were up to me Sure, if we were to rebuild everything from scratch... 
well we wouldn't do it in C/C++ in the first place, I'm pretty sure. But my Linux distribution exists, has a lot of merits, and I don't find it very nice when people try to enforce their preferences. I am fine if people want to use Flatpak, cargo, pip, nix, their system package manager, something else, or a mix of all that. But I like being able to install packages on my Gentoo system the way I like, potentially modifying them with a user patch. I like being able to choose if I want to link statically or dynamically (on my Linux, I like to link at least some libraries like OpenSSL dynamically, if I build an Android apk, I like to statically link the dependencies). And I feel like I am not forcing anyone into doing what I like to do. I actually think that most people should not use Gentoo. I don't prevent anyone from using Flatpak or pulling half the Internet with docker containers for everything. But if they come telling me that my way is crap, I will defend it :-). > I am somewhat at a loss. I guess I was not trying to say "C/C++ is great, there is nothing to change". I just think it's not all crap, and I see where it all comes from and why we can't just throw everything away. There are many things to criticise, but many times I feel like criticisms are uninformed and just relying on the fact that everybody does that. Everybody spits on CMake, so it's easy to do it as well. But more often than not, if I start talking to someone who said that they cannot imagine how someone could design something as bad as CMake, they themselves write terrible CMakeLists. Those who can actually use CMake are generally a lot more nuanced. | | |
| ▲ | Defletter 6 hours ago | parent [-] | | Even though I understand why you prefer that, I feel like you're painting too rosy of an image. To quote Tom Delalande: "There are some projects where if it was 10% harder to write the code, the project would fail." I believe this deeply and that this is also true for the build system: your build config should not be rivalling your source code in terms of length. That's hyperbole in most cases, sure, and may well indicate badly written build configs, but writing build configs should not be a skill issue. I am willing to bet that Rust has risen so much in popularity not just because of its memory safety, but also because of its build system. I don't like CMake, but I also don't envy its position. |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
| ▲ | palata 5 days ago | parent | prev | next [-] | | They are massively loved because people don't want to learn how it works. But the result is that people massively don't understand how package management works, and miss the real cost of dependencies. | |
| ▲ | pjmlp 4 days ago | parent | prev [-] | | MSBuild also does C++. |
|
| |
| ▲ | palata 5 days ago | parent | prev | next [-] | | Modern languages don't generally play nice with linux distributions, IMO. C and C++ have an answer to the dependency problem, you just have to learn how to do it. It's not rocket science, but you have to learn something. Modern languages remove this barrier, so that people who don't want to learn can still produce stuff. Good for them. | |
| ▲ | bluGill 4 days ago | parent | prev [-] | | No they don't - at least not a good answer. It generally amounts to running a different build system and waiting - this destroys parallelism and slows the build down. |
|
| |
| ▲ | jchw 5 days ago | parent | prev [-] | | I'm not going to defend the fact that the C++ devex sucks. There are really a lot of reasons for it, some of which can't sensibly be blamed on the language and some of which absolutely can be. (Most of it probably just comes down to the language and tooling being really old and not having changed in some specific fundamental ways.) However, it's definitely wrong to say that the typical tools are "non-portable". The UNIX-style C++ toolchains work basically anywhere, including Windows, although I admit some of the tools require MSys/Cygwin. You can definitely use GNU Makefiles with pkg-config using MSys2 and have a fine experience. Needless to say, this also works on Linux, macOS, FreeBSD, Solaris, etc. More modern tooling like CMake and Ninja works perfectly fine on Windows, doesn't need any special environment like Cygwin or MSys, and can use your MSVC installation just fine. I don't really think applying the mantra of Rust package management and build processes to C++ is a good idea. C++'s toolchain is amenable to many things that Rust and Cargo aren't. Instead, it'd be better to talk about why C++ sucks to use, and then try to figure out what steps could be taken to make it suck less. Like:
- Building C++ software is hard. There's no canonical build system, and many build systems are arcane. This one really might be a tough nut to crack. The issue is that creating yet another system is bound to just cause xkcd 927. As it is, there are many popular ways to build, including GNU Make, GNU Autotools + Make, Meson, CMake, Visual Studio Solutions, etc. CMake is the most obvious winner right now. It has achieved de facto standard support. It works on basically any operating system, and IDEs like CLion and Visual Studio 2022 have robust support for CMake projects. Most importantly, building with CMake couldn't be much simpler. It looks like this:
$ cmake -B .build -S .
...
$ cmake --build .build
...
And you have a build in .build. I think this is acceptable. (A one-step build would be simpler, but this is definitely more flexible; I think it is very passable.) This does require learning CMake, and CMakeLists files are definitely a bit ugly and sometimes confusing. Still, they are pretty practical, and rather easy to get started with, so I think it's a clear win. CMake is the "de facto" way to go here.
- Managing dependencies in C++ is hard. Sometimes you want external dependencies, sometimes you want vendored dependencies. This problem's even worse. CMake helps a little here, because it has really robust mechanisms for finding external dependencies. However, while robust, the mechanism is definitely a bit arcane; it has two modes, the legacy Find scripts mode and the newer Config mode, and some things like version constraints can have strange and surprising behavior (it differs on a lot of factors!) But sometimes you don't want to use external dependencies, like on Windows, where it just doesn't make sense. What can you really do here? I think the most obvious thing to do is use vcpkg. As the name implies, it's Microsoft's solution to source-level dependencies. Using vcpkg with Visual Studio and CMake is relatively easy, and it can be configured with a couple of JSON files (and there is a simple CLI that you can use to add/remove dependencies, etc.) When you configure your CMake build, your dependencies will be fetched and built appropriately for your targets, and then CMake's find package mechanism can be used just as it is used for external dependencies. CMake itself is also capable of vendoring projects within itself, and it's absolutely possible to support all three modalities of manual vendoring, vcpkg, and external dependencies. However, for obvious reasons this is generally not advisable. It's really complicated to write CMake scripts that actually work properly in every possible case, and many cases need to be prevented because they won't actually work. All of that considered, I think the best existing solution here is CMake + vcpkg. When using external dependencies is desired, simply not using vcpkg is sufficient and the external dependencies will be picked up as long as they are installed. This gives an experience much closer to what you'd expect from a modern toolchain, but without limiting you from using external dependencies, which is often unavoidable in C++ (especially on Linux.)
- Cross-compiling with C++ is hard. In my opinion this is mostly not solved by the "de facto" toolchains. :) It absolutely is possible to solve this. Clang is already better off than most of the other C++ toolchains in that it can handle cross-compiling by selecting cross-compile targets at runtime rather than build time. This avoids the issue in GCC where you need a toolchain built for each target triplet you wish to target, but you still run into the issue of needing libc/etc. for each target. Both CMake and vcpkg technically do support cross-compilation to some extent, but I think it rarely works without some hacking around in practice, in contrast to something like Go. If cross-compiling is a priority, the Zig toolchain offers a solution for C/C++ projects that includes both effortless cross-compiling as well as an easy-to-use build command. It is probably the closest to solving every (toolchain) problem C++ has, at least in theory. However, I think it doesn't really offer much for C/C++ dependencies yet.
There were plans to integrate vcpkg for this I think, but I don't know where they went. If Zig integrates vcpkg deeply, I think it would become the obvious choice for modern C++ projects. I get that by not having a "standard" solution, C++ remains somewhat of a nightmare for people to get started in, and I've generally been doing very little C++ lately because of this. However I've found that there is actually a reasonable happy path in modern C++ development, and I'd definitely recommend beginners to go down that path if they want to use C++. | | |
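To make the CMake + vcpkg combination concrete, a minimal sketch in manifest mode (the package names are only examples; the toolchain file path depends on where vcpkg is checked out):

    vcpkg.json, placed next to CMakeLists.txt (the name and dependencies are placeholders):
    {
      "name": "example-app",
      "version-string": "0.1.0",
      "dependencies": [ "fmt", "zlib" ]
    }

    $ cmake -B .build -S . -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake
    $ cmake --build .build

Configuring through the vcpkg toolchain file fetches and builds the listed dependencies, after which find_package picks them up like any other installed package.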
| ▲ | palata 4 days ago | parent [-] | | > Using vcpkg [...] When you configure your CMake build, your dependencies will be fetched and built appropriately for your targets, and then CMake's find_package mechanism can be used just as it is used for external dependencies. Yes! I believe this is powerful: if CMake is used properly, it does not have to know where the dependencies come from; it will just "find" them. So they could be installed on the system, or fetched by a package manager like vcpkg or conan, or just built and installed manually somewhere. > Cross-compiling with C++ is hard. Just wanted to mention the dockcross project here. I find it very useful (you just build in a docker container that has the toolchain set up for cross-compilation) and it "just works". |
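In case it helps, the dockcross workflow is roughly this, if I remember it right (linux-armv7 is just one example image; the helper script the image prints out runs your command inside the container):
docker run --rm dockcross/linux-armv7 > ./dockcross-linux-armv7
chmod +x ./dockcross-linux-armv7
./dockcross-linux-armv7 bash -c 'cmake -B build -S . && cmake --build build'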
|
|
| |
| ▲ | jandrese 4 days ago | parent | prev | next [-] | | You also don't do "rustc build". Cargo is a build system too. The whole point of pkg-config is to tell the compiler where those packages are. I mean yeah, that's the point of having a tool like that. It's fine that the compiler doesn't know that, because its job is turning source into executables, not being the OS glue. I'm not sure "having a linker" is a weakness? What are we talking about? It is true that you need to use the package manager to install the dependencies. This is more effort than having a package manager download them for you automatically, but on the other hand you don't end up in a situation where you need virtual environments for every application because they've all downloaded slightly different versions of the same packages. It's a bit of a philosophical argument as to what is the better solution. The argument that it is too hard for students seems a bit overblown. The instructions for getting this up and running are: 1. apt install build-essential
2. extract the example files (Makefile and c file), cd into the directory
3. type "make"
4. run your program with ./programname
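For reference, that example Makefile might look something like this (just a sketch with made-up file names, not the actual example file):
CC = cc
CFLAGS = -Wall -O2
OBJS = file1.o file2.o file3.o
programname: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)
(The .o files are built from the matching .c files by make's built-in rules.)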
I'd argue that is fewer steps than setting up almost any IDE. The Makefile is 6 lines and is easy to adapt to any similar size project. The only major weakness is headers, in which case you can do something like: HEADERS=headerA.h headerB.h headerC.h
file1.o: $(HEADERS)
file2.o: $(HEADERS)
file3.o: $(HEADERS)
If you change any header it will trigger a full rebuild of the project, but on C projects this is fine for a long time. It's just annoying that you have to create a new entry for every c file you add to the project instead of being able to tell make to add that to every object automatically. I suspect there is a very arcane way to do this (one such trick is sketched just below), but I try to keep it as simple as possible. | | |
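The arcane way, roughly (an untested sketch, assuming all the .c and .h files sit in one flat directory):
SRCS = $(wildcard *.c)
OBJS = $(SRCS:.c=.o)
HEADERS = $(wildcard *.h)
programname: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)
$(OBJS): $(HEADERS)
Every object now depends on every header automatically, so new .c or .h files get picked up without editing the Makefile.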
| ▲ | steveklabnik 4 days ago | parent [-] | | I'm not your parent, but the overall point of this kind of thing is that all of these individual steps are more annoying and error-prone than one command that just takes care of it. `cargo build` is all you need to build the vast majority of Rust projects. No need to edit the Makefile for those headers, or remember which commands you need to install the various dependencies, name them individually, and figure out which name maps to your distro's naming scheme, etc. It's not just "one command vs five", it's "one command for every project vs five commands that differ slightly per project and per platform". `make` can come close to this, and it's why people love `./configure; make`, and there's no inherent reason why this couldn't paper over some more differences to make it near universal, but that still only gets you Unix platforms. > but on the other hand you don't end up in a situation where you need virtual environments for every application because they've all downloaded slightly different versions of the same packages. The real downside here is that if you need two different programs with two different versions of packages, you're stuck. This is often mitigated by things like foo vs foo2, but I have been in a situation where two projects both rely on different versions of foo2, and cannot be unified. The per-project dependency strategy handles this with ease; the global strategy cannot. |
| |
| ▲ | benreesman 5 days ago | parent | prev | next [-] | | clang++ $(pkg-config --cflags --libs libtorch) qwen-3-nvfp4.cpp -o ./qwen-3-infer Your move. | | | |
| ▲ | worik 5 days ago | parent | prev [-] | | None of that is a problem. There are a lot of problems, but having to carefully construct the build environment is a minor one time hassle. Then: repeated foot guns going off, no toes left, company bankrupt, and the banking system crashed, again. | | |
| ▲ | vlovich123 5 days ago | parent | next [-] | | > There are a lot of problems, but having to carefully construct the build environment is a minor one time hassle. I've observed the existence in larger projects of "build engineers" whose sole job is to keep the project building on a regular cadence. These jobs predominantly seem to exist in C++ land. | | |
| ▲ | TuxSH 5 days ago | parent | next [-] | | > These jobs predominantly seem to exist in C++ land. You wish. These jobs exist at companies with large monorepos in other languages too, and/or when you have many projects. Plenty of stuff to handle in big companies (directory ownership, Jenkins setup, in-company dependency management and release versioning, developer experience in general, etc.) | |
| ▲ | josefx 5 days ago | parent | prev | next [-] | | Most of what I have seen came from technical debt acquired over decades. Some of the build engineers hired to "manage" it weren't themselves treated as programmers and just added on top of the mess with "fixes" that were never reviewed or even checked in. Had a fun time once after we reinstalled the build server and found out that the last build engineer had created a local folder to store various dependencies instead of using vcpkg to fetch everything, as we had mandated for several years by then. | |
| ▲ | pjmlp 5 days ago | parent | prev | next [-] | | I have been a "build engineer" across many projects, regardless of the set of programming languages being used; this is not specific to C++. | |
| ▲ | jandrewrogers 5 days ago | parent | prev | next [-] | | I’ve only ever seen this on extraordinarily complex codebases that mixed several languages. Pure C++ scales really well these days. | |
| ▲ | wojciii 5 days ago | parent | prev [-] | | How is that even possible? Wasn't CI invented to solve just this problem? | | |
| ▲ | vlovich123 5 days ago | parent [-] | | You have a team of 20 engineers on a project you want to maintain velocity on. With that many cooks, you have patches on top of patches of your build system where everyone does the bare minimum to meet the near-term task only and it devolves into a mess no one wants to touch over enough time. Your choice: do you have the most senior engineers spend time sporadically maintaining the build system, perhaps declaring fires to try to pay off tech debt, or hire someone full-time, perhaps cheaper and with better expertise, dedicated to the task instead? CI is an orthogonal problem but that too requires maintenance - do you maintain it ad-hoc or make it someone’s official responsibility to keep it maintained and flexible for the team’s needs? I think you think I’m saying the task is keeping the build green whereas I’m saying someone has to keep the system that’s keeping the build green going and functional. | |
| ▲ | AdieuToLogic 5 days ago | parent | next [-] | | > You have a team of 20 engineers on a project you want to maintain velocity on. With that many cooks, you have patches on top of patches of your build system ... The scenario you are describing does not make sense for the commonly accepted industry definition of "build system." It would make sense if, instead, the description was "application", "product", or "system." Many software engineers use and interpret the phrase "build system" to mean something akin to make[0] or a similar solution used to produce executable artifacts from source code assets. 0 - https://man.freebsd.org/cgi/man.cgi?query=make&apropos=0&sek... | |
| ▲ | vlovich123 5 days ago | parent [-] | | I can only relate to you what I’ve observed. Engineers were hired to rewrite the Make-based system into Bazel and maintain it for a single executable distributed to the edge. I’ve also observed this for embedded applications and other stuff. I’m not sure why you’re dismissing it as something else without knowing any of the details or presuming I don’t know what I’m talking about. | |
| ▲ | AdieuToLogic 3 days ago | parent [-] | | >>> You have a team of 20 engineers on a project you want to maintain velocity on. With that many cooks, you have patches on top of patches of your build system ... >> The scenario you are describing does not make sense for the commonly accepted industry definition of "build system." > I’m not sure why you’re dismissing it as something else without knowing any of the details or presuming I don’t know what I’m talking about. My apologies for what I wrote giving the impression of being dismissive or implying an assessment of your knowledge. This was not my intent and instead was my expression of incredulity at a build definition requiring 20 engineers to maintain. Perhaps I misinterpreted the "cooks" responsible for build definition maintenance as being all of those 20 engineers. If so, I hope you can see how someone not involved in your project could reach this conclusion based on the quote above. Still and all, if this[0] is the Bazel build tool you reference and its use is such that: With that many cooks, you have patches on top of patches
of your build system where everyone does the bare minimum
to meet the near term task only and it devolves into a mess
no one wants to touch over enough time.
Then the questions I would ask of the project members/stakeholders are: 1 - Does using Bazel reduce build definition
maintenance versus other build tools such as
Make/CMake/etc.?
2 - Does the engineering team value reproducible build
definitions as much as production and test source
artifacts?
3 - If not, why not?
EDIT: To clarify the rationale behind questions #2 and #3: Build definitions are production code, because if the system cannot be built, then it cannot be released. Test suites are production code, because if tests fail, then the build should fail and the system cannot be released. 0 - https://bazel.build/start/cpp |
|
| |
| ▲ | wojciii 2 days ago | parent | prev [-] | | I worked in companies that did this... 20 years ago. I didn't imagine that this was still possible. For me it's just about rules/discipline: commit working code with passing unit tests. Everyone is responsible for fixing stuff. You break something, you fix it. |
|
|
| |
| ▲ | juliangmp 5 days ago | parent | prev [-] | | > minor one time hassle I don't know if you're joking or just naïve, but cmake and the like are massive time sinks if you want anything beyond "here's a few source files, make me an application" |
|
|
|