suladead 4 days ago
I built pretty much this exact rig myself, but now it's gathering dust. Any other uses for it besides local LLMs?
Tepix 4 days ago
Sell it? There are people who want a rig like this.
dotnet00 4 days ago
The 3090 in my server (Ollama on it is only used occasionally nowadays, since I have dual 5080s on my work desktop) also handles accelerated transcoding in Plex, and is in the process of being set up to monitor my 3D printers for failures via camera. I'm also considering setting up Home Assistant with LLM support again.
asimovDev 4 days ago
Play D&D by yourself with Llama as the DM.
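A minimal sketch of how that could work against a local Ollama server, assuming a Llama model has already been pulled. The model name, system prompt, and helper functions here are illustrative, not any particular project's API:

```python
import json
import urllib.request

def build_dm_request(history, player_input, model="llama3"):
    """Assemble an Ollama /api/chat payload with a dungeon-master system prompt.

    `history` is a list of prior {"role": ..., "content": ...} messages so the
    DM remembers the campaign so far.
    """
    messages = [{
        "role": "system",
        "content": ("You are a D&D dungeon master. Narrate scenes, play the "
                    "NPCs, and always end by asking the player what they do."),
    }]
    messages += history
    messages.append({"role": "user", "content": player_input})
    return {"model": model, "messages": messages, "stream": False}

def ask_dm(payload, url="http://localhost:11434/api/chat"):
    """Send the payload to a locally running Ollama instance (`ollama serve`)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the reply under "message"."content".
        return json.loads(resp.read())["message"]["content"]

# Usage (requires a running Ollama server with the model pulled):
#   payload = build_dm_request([], "I push open the tavern door.")
#   print(ask_dm(payload))
```

Loop that in a `while True` with the history appended each turn and you have a solo campaign.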
robotswantdata 4 days ago
Heating
DaSHacka 4 days ago
vidya
winkelmann 4 days ago
3D rendering and fluid simulation stuff could be interesting.
thiago_fm 4 days ago
Playing games; it has a good graphics card.