ares623 2 hours ago:
The obsession with token discounts recently is pretty funny. If you extrapolate far enough, you end up back where we started: programming languages.
thegagne 2 hours ago:
Hah, I think about this all the time. We subtly want LLMs to be more and more deterministic and efficient, which is why one of the main uses of LLMs is building tools that make their own job easier. I built my own project with reducing token spend as one of its goals, but found the real goal was just ensuring quality and making things more programmatic. It basically ends up being agents.md as a schema-driven YAML file, and I'm thinking about extending it to also generate or replace skill.md.

The proliferation of markdown is cool and lowers the barrier to entry, but it's also very unpredictable and loose. I think over time we will drive these toward config files instead of free text.
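Roughly the shape I have in mind; field names here are illustrative, not the actual schema:

    # agents.yaml -- hypothetical schema-driven stand-in for a free-text agents.md
    version: 1
    agent:
      name: reviewer
      model: claude-sonnet        # illustrative model id, not a real pin
      rules:
        - id: style
          when: "path matches *.py"
          instruction: "Follow PEP 8; flag unused imports."
        - id: tests
          when: "diff touches src/"
          instruction: "Require a matching test change."
      skills:                     # the part that could generate/replace skill.md
        - name: run-tests
          command: "pytest -q"

Validating this against a schema gets you the predictability that free-text markdown can't.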
xandrius 2 hours ago:
Yeah, I wonder what kind of work people do that they need more than a 500k or 1M context window. Even when it's a big project, breaking it down doesn't change the output quality.
nunodonato 2 hours ago:
It's crazy; I have seen so many projects popping up focused purely on reducing token usage. At least caveman speak is funny! I have to say that since we switched to our own model on a rented GPU, we stopped worrying about tokens and just use the hell out of our AI as much as we want :)
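The setup is nothing exotic. A minimal sketch, assuming a vLLM-style OpenAI-compatible server (the model name and host are placeholders, any OpenAI-compatible stack works the same way):

    # pip install vllm openai
    # Start the server on the rented GPU box:
    #   vllm serve Qwen/Qwen2.5-7B-Instruct   # OpenAI-compatible API on :8000
    from openai import OpenAI

    # Standard OpenAI client pointed at your own box; no per-token billing.
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="unused")
    resp = client.chat.completions.create(
        model="Qwen/Qwen2.5-7B-Instruct",
        messages=[{"role": "user", "content": "Refactor this function."}],
    )
    print(resp.choices[0].message.content)

Once the GPU is a fixed cost, the whole token-discount game disappears.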
nunodonato 2 hours ago:
Remember TOON? It killed JSON. /s, just in case.
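For the unfamiliar: TOON's pitch was flattening JSON arrays of objects into a CSV-like table to spend fewer tokens. From memory, roughly:

    JSON:
      {"users":[{"id":1,"name":"Alice"},{"id":2,"name":"Bob"}]}

    TOON (same data; fewer tokens was the claim):
      users[2]{id,name}:
        1,Alice
        2,Bob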
| ||||||||