zozbot234 | 3 days ago
> Was there some relatively recent fundamental breakthrough or other change that prevented a Fil-C-like approach from being viable before?

The provenance model for C is very recent (and is still a TS, not part of the standard). Prior to that, there was only a vague notion that the C abstract machine has quasi-segmented memory (you aren't really allowed to do arithmetic on a pointer to one "object" to reach a different "object"), but this was not clearly stated in usable terms.
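A minimal sketch of the kind of inter-object arithmetic that notion rules out (the variables and offsets here are purely illustrative):

    #include <stdio.h>

    int main(void)
    {
        int a = 1;
        int b = 2;

        int *p = &a;
        int *q = p + 1;  /* forming a one-past-the-end pointer to a is fine */

        /* Even if b happens to sit right after a in memory, using q to
         * reach b is undefined: q's provenance is a, not b, so the
         * quasi-segmented model treats the two objects as separate. */
        /* *q = 5; */    /* would be undefined behavior */

        printf("%d %d\n", a, b);
        return 0;
    }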
actionfromafar | 3 days ago
Also, in practical terms, you have a lot more address space to "waste" in 64-bit. It would have been frivolous in 32-bit and downright offensive in 16-bit code.
uecker | 2 days ago
The memory model always had segmented memory in mind, and safe C approaches are not new. The provenance model makes this more precise, but the need for it arose from corner cases such as pointer-to-integer round trips or access to the representation bytes of a pointer. Of course, neither GCC nor clang gets this right, to the extent that those compilers are internally inconsistent and miscompile even code that did not need any clarification to be considered correct.
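A minimal sketch of the two corner cases mentioned, purely for illustration; these are the round trips the provenance model has to pin down:

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        int x = 42;

        /* Pointer-to-integer round trip: the integer carries no
         * provenance of its own, so the model has to say which object
         * the reconstructed pointer may access. */
        uintptr_t u = (uintptr_t)&x;
        int *p = (int *)u;
        *p = 43;

        /* Access to the representation bytes of a pointer: copying a
         * pointer byte by byte should yield a pointer with the same
         * provenance, which also needs to be spelled out. */
        int *q;
        memcpy(&q, &p, sizeof p);
        *q = 44;

        printf("%d\n", x);
        return 0;
    }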