janalsncm 8 months ago:
You mentioned it took 100 GPU hours; what GPU did you train on?
ollin 8 months ago (parent):
Mostly 1xA10 (though I switched to 1xGH200 briefly at the end; Lambda has a sale going). The network used in the post is very tiny, but I had to train for a really long time with a large batch size to get somewhat-stable results.