jjcm an hour ago
I did this: https://image.non.io/2093de83-97f6-43e1-a95e-3667b6d89b3f.we... Literally just downloaded the model into a folder, opened Cursor in that folder, and told it to get it running. Prompt:

> The gguf for bonsai 8b are in this local project. Get it up and running so I can chat with it. I don't care through what interface. Just get things going quickly. Run it locally - I have plenty of vram. https://huggingface.co/prism-ml/Bonsai-8B-gguf/tree/main

I had to ask it to increase the context window to 64k, but other than that it got it running just fine. After that I just pointed ngrok at the port it was serving on and voilà.
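For anyone who wants to skip the agent step, the workflow above can be done by hand in a couple of commands. This is a sketch assuming llama.cpp's `llama-server` as the runtime (the comment doesn't say which server Cursor picked) and a hypothetical model filename:

```shell
# Serve the GGUF locally with llama.cpp's llama-server.
# -c 65536 sets the 64k context window the commenter asked for;
# --port picks where the OpenAI-compatible HTTP API listens.
# "bonsai-8b.gguf" is a placeholder -- use the actual filename
# from the Hugging Face repo.
llama-server -m ./bonsai-8b.gguf -c 65536 --port 8080

# In another terminal, expose that port publicly via ngrok.
ngrok http 8080
```

`llama-server` also ships a minimal web chat UI at the root URL, so the ngrok address is immediately usable in a browser.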