vessenes 7 hours ago
Salvatore - this is cool. I am a fan of using Steve Yegge's beads for this - it generally cuts the markdown file cruft significantly. Did you run any benchmarking? I'm curious whether Python's stack is faster or slower than a pure C vibe-coded inference tool.
samtheprogram 2 hours ago
There are benchmarks in the README. Python is ~10x faster; it's heavily optimized. Based on the numbers and my experience with Flux.1, I'm guessing the Python run is JIT'd (or Flux.2 is simply faster), although even without the JIT it'd likely only be about half as fast as it is now (i.e. definitely not 10x slower).
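For what it's worth, a quick way to sanity-check the JIT guess, assuming the Python stack is PyTorch-based: time the same forward pass eager vs. torch.compile'd. The tiny model below is just a hypothetical stand-in for the real Flux pipeline (not code from either repo), purely so the sketch runs:

    import time
    import torch
    import torch.nn as nn

    # Hypothetical stand-in for the real Flux transformer; only here so the sketch is runnable.
    model = nn.Sequential(nn.Linear(1024, 4096), nn.GELU(), nn.Linear(4096, 1024)).eval()
    x = torch.randn(64, 1024)

    def bench(fn, warmup=3, iters=10):
        with torch.inference_mode():
            for _ in range(warmup):   # warm-up absorbs the one-time compile cost
                fn(x)
            t0 = time.perf_counter()
            for _ in range(iters):
                fn(x)
        return (time.perf_counter() - t0) / iters

    eager = bench(model)
    compiled = bench(torch.compile(model))  # PyTorch 2.x JIT compilation
    print(f"eager: {eager*1e3:.1f} ms/iter  compiled: {compiled*1e3:.1f} ms/iter")

If the compiled run comes out much faster than eager on the real model, that would point to the JIT rather than Flux.2 itself being the source of the gap.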