lynndotpy 3 days ago
If you're seriously doing deep learning research, it's very, very nice to own your own GPU. For four years of AI PhD research I worked with a 1050Ti on a personal laptop and a 2060 on a personal desktop. You can do a lot of validation and development on consumer GPUs. That said, the OP does not train an LLM from scratch on a 3090. That would not be feasible.
joefourier 3 days ago | parent | next [-]
The OP literally did train an LLM from scratch on a 3090 (except for the tokenizer); that's what the whole post is about.
deskamess 3 days ago | parent | prev [-]
I have an old 2060 with 6GB (I think). I also have a work laptop with a 3060 with 6GB (shared to 8GB). What can I do with those? I dabble a bit here and there, but I would like to run my own local LLM for 'fun'. Thanks!
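For reference, a 6 GB card can comfortably run quantized 7B-class models with tools like llama.cpp or Ollama. A rough back-of-the-envelope sketch of what fits in VRAM — the fixed overhead constant here is an assumption for illustration, not a measured figure:

```python
def vram_needed_gb(n_params_billion, bits_per_weight, overhead_gb=1.0):
    """Rough VRAM estimate: quantized weights plus a flat overhead.

    overhead_gb is an assumed allowance for KV cache and activations;
    real usage varies with context length and runtime.
    """
    weights_gb = n_params_billion * 1e9 * (bits_per_weight / 8) / 1024**3
    return weights_gb + overhead_gb

# A 7B model at 4-bit quantization: ~3.26 GB of weights, ~4.26 GB total,
# which fits within a 6 GB card by this estimate.
print(round(vram_needed_gb(7, 4), 2))  # → 4.26
```

By the same arithmetic, an unquantized 7B model in fp16 needs ~13 GB for weights alone, which is why quantization is what makes these cards usable for local inference.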
| ||||||||