deathanatos a day ago
I think even having references that aren't necessarily null is only part of it. Imagine that your language supports two forms of references, one nullable, one not. Let's just borrow C++ here:
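The snippet the comment refers to did not survive the thread extraction. A minimal sketch of the two reference forms being contrasted — the type `T`, its `thing` member, and the helper names are illustrative, not from the original — might look like:

```cpp
#include <cassert>

struct T {
    int thing = 42;
};

// A non-nullable reference parameter: the caller must bind it to a real T,
// so the body can use it with no null check.
int read_via_reference(T& ref) {
    return ref.thing; // always refers to an object
}

// A nullable pointer parameter: the type admits "no object at all",
// so the body has to account for nullptr.
int read_via_pointer(T* ptr) {
    return ptr ? ptr->thing : -1; // -1 stands in for the "no value" case
}
```

The reference form encodes "definitely a T" in the type; the pointer form encodes "maybe a T", and the compiler does nothing to make you check before using it.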
The latter is still a bad idea, even if it isn't the only reference form, and even if it isn't the default, if it lets you do this:
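The code this refers to is also missing; a hedged sketch of the problematic pattern, using the same illustrative `T` with a `thing` member, would be:

```cpp
#include <cassert>

struct T {
    int thing = 0;
};

// The type system happily accepts a possibly-null pointer here and lets us
// dereference it without any check: fine when given a real T, undefined
// behavior when given nullptr.
int get_thing(T* maybe) {
    return maybe->thing; // compiles either way; null makes this UB
}
```

Nothing in the signature or the call site forces the null case to be handled; the error only surfaces at runtime, if at all.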
Where only things that are, e.g., of type T have a `thing` attribute. Nulls are "obviously" not T, but a good number of languages' type systems that permit nullable references, or some form of them, allow treating what is in actuality T|null as if it were just T, usually leading to some form of runtime failure if null is actually used, ranging from UB (in C, C++) to panics/exceptions (Go, Java, C#, TS).

It's an error that can be caught by the type system (any number of other languages demonstrate that), and null pointer derefs are one of those bugs that just plague the languages that have them.
bazoom42 10 hours ago
TypeScript actually supports nulls through type unions, exactly as Hoare suggests. It will not let you dereference a possibly-null value without a check. C# also supports null safety, although less elegantly and as opt-in. If enabled, it won't let you dereference a possibly-null reference.