brutus1213 4 days ago
I recently got a 5090 with 64 GB of RAM (Intel CPU). Was just looking for a strong model I can host locally. If I had the performance of GPT-4o, I'd be content. Are there any suggestions, or cases where people got disappointed?
bogtog 4 days ago
GPT-OSS-20B at 4 or 8 bits is probably your best bet, with Qwen3-30B-A3B probably the next best option. Maybe there exists some 1.7- or 2-bit version of GPT-OSS-120B.
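For a rough sense of what fits in a given amount of VRAM, a back-of-envelope estimate is parameter count × bits per weight, plus some headroom for KV cache and activations. A minimal sketch in Python; the parameter counts, bits-per-weight figures, and the flat overhead factor are assumptions for illustration, not official numbers:

```python
# Back-of-envelope VRAM estimate for quantized models.
# Parameter counts, bits per weight, and the overhead factor below
# are rough assumptions, not vendor-published figures.

def est_vram_gib(params_billion: float, bits_per_weight: float,
                 overhead: float = 1.2) -> float:
    """Approximate GPU memory to load the weights, in GiB."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

models = {
    "GPT-OSS-20B @ ~4.5 bpw (assumed ~21B params)":   (21, 4.5),
    "GPT-OSS-20B @ ~8.5 bpw (assumed ~21B params)":   (21, 8.5),
    "Qwen3-30B-A3B @ ~4.5 bpw (assumed ~30B params)": (30, 4.5),
    "GPT-OSS-120B @ ~4.5 bpw (assumed ~117B params)": (117, 4.5),
}

for name, (params, bits) in models.items():
    print(f"{name}: ~{est_vram_gib(params, bits):.0f} GiB")
```

Under these assumptions the 20B and 30B options land comfortably under 32 GiB at 4-bit, while the 120B at 4-bit comes out well above it, which is why people reach for sub-2-bit quants or CPU offload.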
p1esk 4 days ago
The 5090 has 32 GB of VRAM. Not sure that's enough to fit this model.