▲ Quarrelsome 3 hours ago
> But it's not clear where does this original structure come from. Why are its own authors not aware about the (arguably, quite simple) concept of normalization?

I find the bafflement expressed in the article, as well as in the one linked, extremely attractive. It made both a joy to read. Were I to hazard a guess: might it be a consequence of the lack of disk space in those early decades, resulting in developers being cautious about defining new tables, and failing to rationalise that the duplication in their tragic designs would waste more space anyway?

> The other side of this coin is that lots of real-world design have a lot of denormalized representations that are often reasonably-well engineered.

Agreed, but as the OP comment stated, they usually started out normalised and then pushed out denormalised representations for nice contiguous reads. As a victim of maintaining a stack on top of an EAV schema once upon a time, I have great appreciation for contiguous reads.
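To illustrate the EAV point for readers who haven't met it: here is a hypothetical sketch (the table names and data are made up, not from the thread) of the same record stored as entity-attribute-value triples versus as one normalized row. With EAV, reassembling a single record means collecting rows scattered across the whole triple store, which is exactly what makes contiguous reads so appealing by comparison.

```python
# Hypothetical EAV store: every attribute of every entity is its own
# (entity_id, attribute, value) triple, interleaved with everyone else's.
eav_rows = [
    (1, "name", "widget"),
    (2, "name", "gadget"),
    (1, "price", 9.99),
    (2, "price", 19.99),
    (1, "color", "red"),
]

def eav_record(entity_id, rows):
    """Reconstruct one entity by scanning for all of its scattered triples."""
    return {attr: val for (eid, attr, val) in rows if eid == entity_id}

# Normalized alternative: one row per entity, read back in a single lookup.
products = {
    1: {"name": "widget", "price": 9.99, "color": "red"},
    2: {"name": "gadget", "price": 19.99},
}

assert eav_record(1, eav_rows) == products[1]
```

The full scan in `eav_record` stands in for the self-joins (one per attribute) that a real EAV query needs, which is the maintenance pain the comment alludes to.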
▲ petalmind 3 hours ago | parent
> Might it be a consequence of lack of disk space in those early decades

A plausible explanation of "normalization as a process" can actually be found in https://www.cargocultcode.com/normalization-is-not-a-process... (under "So where did it begin?"). I hope someday to find a technical report, from around that time, on migrating to a relational database.