FireInsight 5 days ago |
Oh, one of the worst forms of torture is definitely trying to get a random Python AI project from GitHub running locally. There's almost always a version conflict between Python, CUDA, PyTorch, and a hodgepodge of pip and conda packages. Publishing a requirements.txt is the bare minimum everybody usually does, but that's usually not enough to reconstruct the environment. The ecosystem should just standardize on declaratively prebuilt container environments or something. Granted, my experience is mostly from the GPT-2 era, so I'm not sure if it's still this painful.
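(As a rough illustration of the "declaratively prebuilt container" idea: a Dockerfile can pin the CUDA base image, the Python version, and a PyTorch wheel built against that CUDA release, so the whole stack is reconstructed from one file. The specific image tag and torch version below are just placeholder examples, not a recommendation.)

```dockerfile
# Sketch: pin the full CUDA/Python/PyTorch stack declaratively.
# Base image fixes the CUDA + cuDNN versions.
FROM nvidia/cuda:12.1.1-cudnn8-runtime-ubuntu22.04

# Fix the Python version at the OS level.
RUN apt-get update && apt-get install -y python3.10 python3-pip

# Install a torch wheel built against the image's CUDA version
# (the cu121 index matches the CUDA 12.1 base image above).
RUN pip3 install torch==2.2.0 --index-url https://download.pytorch.org/whl/cu121

# Then layer the project's own pinned dependencies on top.
COPY requirements.txt .
RUN pip3 install -r requirements.txt
```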
phatskat 4 days ago | parent |
Don’t know if this would help your case or not, but jart’s llamafile seems like it would be useful.