tetha 4 days ago

The article does touch on what I was thinking at the end of its first section: this usually makes sense if your application has to manage a lot of complexity, or rather, has to consume and produce the same domain objects in many different ways across many different APIs.

For example, some systems interact with several different vendor, tracking and payment systems that are all kinda the same, but also kinda different. Here it makes sense to have an internal domain model and to normalize all of these other systems into your domain model at a very early level. Otherwise complexity rises very, very quickly, because you end up with n things interacting with n other things (n×n translation paths instead of n adapters).
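A minimal sketch of that "normalize early" idea, sometimes called an anti-corruption layer: each external payment provider gets one adapter into a single internal domain type. All names here (the `Payment` fields, the vendor payload shapes) are illustrative assumptions, not taken from the comment.

```python
from dataclasses import dataclass

# Hypothetical internal domain model -- the one representation the rest
# of the system works with, regardless of which vendor the data came from.
@dataclass
class Payment:
    payment_id: str
    amount_cents: int
    currency: str
    status: str  # "pending" | "settled" | "failed"

def from_vendor_a(raw: dict) -> Payment:
    # Assumed vendor A quirks: amounts as decimal strings, status under "state".
    return Payment(
        payment_id=raw["id"],
        amount_cents=round(float(raw["amount"]) * 100),
        currency=raw["currency"].upper(),
        status={"ok": "settled", "open": "pending"}.get(raw["state"], "failed"),
    )

def from_vendor_b(raw: dict) -> Payment:
    # Assumed vendor B quirks: integer minor units, but nested differently.
    return Payment(
        payment_id=raw["payment"]["reference"],
        amount_cents=raw["payment"]["minor_units"],
        currency=raw["payment"]["ccy"],
        status=raw["status"],
    )
```

The payoff is the n-vs-n×n point above: adding a third vendor means one more adapter, not a new translation for every consumer.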

On the other hand, for a lot of our smaller and simpler systems that output JSON based off a database for other systems... it's a realistic question whether maintaining the domain model and the API translation for every endpoint, in every change, is actually less work than ripping out the API modelling framework, which happens once every few years, if at all. Some teams would probably rewrite from scratch with new knowledge anyway, especially if they have API tests available.

AlphaSite 3 days ago | parent [-]

I'd say it's more important when you need to manage database performance. It lets you design an API that's pleasant for users while staying well normalised internally, and still performing well.

Exposing a schema driven by normalisation and performance usually leads to a poor API that's hard for users to use and hard to evolve, since your internal representation is tightly coupled to your external one.