goranmoomin 3 hours ago
The impact on context tokens is more of a 'you're holding it wrong' problem, no? The GH MCP server burning tokens is an issue with that server, not with the protocol itself. (That said, since the gh CLI is likely strongly represented in the training data, it would probably be more effective to just use the CLI in this case.) I do think we should more widely adopt Amp's MCPs-on-skills model that I mentioned in my original comment, since it allows on-demand context management.