csmantle · a day ago
"AI" comes in various flavors. It could be a expert system, a decision forest, a CNN, a Transformer, etc. In most inference scenarios the model is fixed, the input/output shapes are pre-defined and actions are prescribed. So it's not that dynamic after all. | ||
vlovich123 · 21 hours ago
This is also true of LLMs. I'm really not sure of OP's point; AI (really all ML) is close to the canonical "trivial to preallocate" problem. Where dynamic allocation starts to be really helpful is when you want to minimize your peak RAM usage for coexistence purposes (e.g. you have other processes running), or when you want to undersize your physical RAM requirements by exploiting temporal separation between different parts of the code (i.e. components A and B never use memory simultaneously, so either can reuse the same RAM). It also simplifies some algorithms, and if you're ever dealing with variable-length inputs it can spare you from reasoning about maximums at design time (provided you correctly handle allocation failure).