wfn | 2 days ago |
> but then the shell commands were actually running llama.cpp, a mistake probably no human would make. But in the docs I see things like
Wouldn't this explain that? (Didn't look too deep)
danielhanchen | 2 days ago | parent |
Yes, it's probably the ordering of the docs that's the issue :) Ie https://docs.unsloth.ai/basics/deepseek-v3.1#run-in-llama.cp... does:

```
apt-get update
apt-get install pciutils build-essential cmake curl libcurl4-openssl-dev -y
git clone https://github.com/ggerganov/llama.cpp
cmake llama.cpp -B llama.cpp/build \
    -DBUILD_SHARED_LIBS=OFF -DGGML_CUDA=ON -DLLAMA_CURL=ON
cmake --build llama.cpp/build --config Release -j --clean-first \
    --target llama-quantize llama-cli llama-gguf-split llama-mtmd-cli llama-server
cp llama.cpp/build/bin/llama-* llama.cpp
```

but then the Ollama section is above it:

```
./llama.cpp/llama-gguf-split --merge \
    DeepSeek-V3.1-GGUF/DeepSeek-V3.1-UD-Q2_K_XL/DeepSeek-V3.1-UD-Q2_K_XL-00001-of-00006.gguf \
    merged_file.gguf
```

I'll edit that area to say you first have to install llama.cpp.
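Beyond reordering, the docs could also fail fast if the merge step is run before the build step. A minimal sketch of such a guard (the `need` helper and the error wording are hypothetical, not from the Unsloth docs):

```shell
#!/bin/sh
# Hypothetical guard: check that a llama.cpp helper binary exists
# and is executable before using it, so the merge step can't run
# before llama.cpp has been built.
need() {
  if [ ! -x "$1" ]; then
    echo "missing $1 -- build llama.cpp first" >&2
    return 1
  fi
}

# Example usage before the Ollama merge step (path from the docs above):
# need ./llama.cpp/llama-gguf-split && ./llama.cpp/llama-gguf-split --merge ...
```

A one-line check like this at the top of the Ollama section would surface the ordering problem immediately instead of producing a confusing "command not found" mid-run.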