doug_durham 12 days ago
The biggest contribution is the LLM-compatible metadata that describes the tool and its arguments. It is trivial to adopt: in Python you can use FastMCP to add a decorator to a function, and as long as that function returns a JSON string you are in business. The decorator extracts the arguments and docstrings and presents them to the LLM.
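A minimal sketch of what that looks like, assuming the standalone fastmcp package (the get_weather tool and its payload are hypothetical):

    import json

    from fastmcp import FastMCP

    mcp = FastMCP("demo-server")

    @mcp.tool()
    def get_weather(city: str) -> str:
        """Return the current weather for the given city."""
        # FastMCP reads the type-hinted signature and this docstring
        # and exposes them to the LLM as the tool's metadata.
        return json.dumps({"city": city, "temp_c": 21})

    if __name__ == "__main__":
        mcp.run()

The function body is ordinary Python; all the protocol plumbing comes from the decorator.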
jillesvangurp 12 days ago | parent
What makes a spec LLM compatible? I've thrown a lot of different things at OpenAI's o1 and it generally understands them better than I do: OpenAPI specifications, unstructured text, log output, etc.