khimaros 2 hours ago
Allocation is irrelevant. As an owner of one of these, you can absolutely use the full 128GB (minus OS overhead) for inference workloads.
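For a rough sense of what "minus OS overhead" leaves you in practice, here's a minimal sketch that checks free memory against an estimated model footprint. The model size, bits-per-weight, and headroom figures are illustrative assumptions, not measurements from this machine:

    # Rough check of how much unified memory is left for model weights + KV cache.
    # Assumes psutil is installed; the model/headroom numbers are illustrative.
    import psutil

    GIB = 1024 ** 3
    mem = psutil.virtual_memory()

    print(f"total:     {mem.total / GIB:.1f} GiB")
    print(f"available: {mem.available / GIB:.1f} GiB")  # usable after OS overhead

    # Example: approximate footprint of a 70B-parameter model at ~4.5 bits/weight
    # (a typical 4-bit quant), plus some headroom for KV cache and runtime buffers.
    model_gib = 70e9 * 4.5 / 8 / GIB   # ~37 GiB of weights
    headroom_gib = 8                   # assumed KV cache / buffer budget
    fits = mem.available / GIB > model_gib + headroom_gib
    print(f"~70B @ 4.5 bpw needs ~{model_gib:.0f} GiB + {headroom_gib} GiB headroom -> fits: {fits}")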
EasyMark an hour ago | parent
Care to go into the machine specs a bit more? I'm interested in picking up a rig to do some LLM stuff and not sure where to get started. I also just need a new machine; mine is 8 years old (with some gaming GPU upgrades) at this point, and it's that time again. No biggie though, just curious what a good modern machine might look like.