aw1621107 4 days ago

> I don't know why the choice to use the compiler to implement parsing a C header is framed as a bad thing.

Not sure I agree with this interpretation, though maybe I'm focusing on a different part of the article than you are. Where are you getting the negative sense from?

That being said, I don't think it's too hard to imagine why someone might be hesitant to use a C/C++ compiler to parse C/C++ headers - for example, it can be a pretty big dependency to take on, it may add friction for devs/users, and integration with your own tool may be awkward and/or an ongoing time sink, especially if you're crossing an FFI boundary or if the API you're using isn't stable (as I believe is the case for LLVM).
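
For a sense of what the "big dependency" route looks like, here's a minimal sketch using libclang (Clang's C interface) to pull function declarations out of a header - "my_header.h" is a placeholder and error handling is omitted:

    /* Sketch: enumerate function declarations in a header via libclang.
       Build with something like: cc file.c -lclang */
    #include <clang-c/Index.h>
    #include <stdio.h>

    static enum CXChildVisitResult visit(CXCursor c, CXCursor parent,
                                         CXClientData data) {
        (void)parent; (void)data;
        if (clang_getCursorKind(c) == CXCursor_FunctionDecl) {
            CXString name = clang_getCursorSpelling(c);
            printf("function: %s\n", clang_getCString(name));
            clang_disposeString(name);
        }
        return CXChildVisit_Recurse;
    }

    int main(void) {
        CXIndex idx = clang_createIndex(0, 0);
        CXTranslationUnit tu = clang_parseTranslationUnit(
            idx, "my_header.h", NULL, 0, NULL, 0, CXTranslationUnit_None);
        if (tu) {
            clang_visitChildren(clang_getTranslationUnitCursor(tu), visit, NULL);
            clang_disposeTranslationUnit(tu);
        }
        clang_disposeIndex(idx);
        return 0;
    }

Twenty-odd lines of code, but it means shipping (or finding) libclang everywhere your tool runs.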

> There is a reason it is called intMAX_t! It does not HAVE a definite size, it is the MAXimal size of an integer on that platform.

I think this somewhat misses the point of the bit you quoted. In context, it's basically saying that grabbing "real" C type info for interop is so painful that people will hard-code "reasonable" assumptions instead.
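
Concretely, the hard-coding tends to look something like this (a hypothetical hand-written binding; lib_abs is made up for illustration):

    #include <stdint.h>

    /* Hand-written FFI declaration that bakes in a guess rather than
       deriving the real type from the platform headers: */
    typedef int64_t guessed_intmax;                /* assumes intmax_t is 64 bits */
    extern guessed_intmax lib_abs(guessed_intmax); /* hypothetical library function */

    /* Compile-time tripwire for when the guess goes stale: */
    _Static_assert(sizeof(intmax_t) == sizeof(guessed_intmax),
                   "hard-coded intmax_t width assumption violated");

It works fine right up until some platform makes intmax_t wider.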

> When you want your program to have a stable ABI, that doesn't change when your platform supports larger integer types, you just don't use intMAX_t!

My impression is that the problem is less that intmax_t can change and more that it can change out of sync between separately compiled binaries. Even if you assume every use of intmax_t in a public API reflects an intentional desire for the bit width to evolve over time, you can still run into nasty issues if you can't recompile everything at once (and if history is any indication, assuming everyone can recompile is not something the C/C++ committees are willing to do).
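
To make the out-of-sync failure concrete, a sketch (lib_sum and libfoo are hypothetical):

    #include <stdint.h>

    /* Public API of some shared library, libfoo: */
    intmax_t lib_sum(intmax_t a, intmax_t b);

    /* If libfoo.so was built when intmax_t was 64 bits, but a caller is
       later compiled with a toolchain where intmax_t is 128 bits, the
       caller passes 128-bit values to code that reads 64-bit ones. The
       header compiles cleanly on both sides; the mismatch is silent and
       only goes away once everything is rebuilt against one definition. */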