| ▲ | sc68cal 10 hours ago |
| So, they implemented a git client in Zig that had some significant speedups for their use case. However: > The git CLI test suite consists of 21,329 individual assertions for various git subcommands (that way we can be certain ziggit does suffice as a drop-in replacement for git). <snip> > While we only got through part of the overall test suite, that's still the equivalent of a month's worth of straight developer work (again, without sleep or eating factored in). |
|
| ▲ | rao-v 10 hours ago | parent | next [-] |
| Wait, so they don’t have test parity with git? How do they know that they, umm … did the actual thing they were trying to do? |
| |
| ▲ | didgetmaster 10 hours ago | parent | next [-] | | I have heard that you can speed up your favorite compression algorithm by 1000x, if you are not so concerned about what happens when you try to decompress it. | | |
| ▲ | magackame 10 hours ago | parent | next [-] | | Also gotta love the write-only disk as a hardware analogy. Insane write speeds and infinite capacity... | |
| ▲ | delusional 7 hours ago | parent | prev [-] | | It's just a lossy compression scheme. |
| |
| ▲ | yevbar 10 hours ago | parent | prev | next [-] | | I ran the test suite specifically for git's CLI, as that was the target I wanted to build towards (Anthropic's C compiler failed to make an operating system since that was never in their original prompts/goals). The way it's organized is that there are "scripts" which encompass different commands (status, diff, commit, etc.); however, each of these scripts itself contains several hundred distinct assertions covering flags and arguments. The test suite was my way of validating that a feature was not only implemented but also "valid" by git's standards | |
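(To make the structure concrete: a hypothetical sketch of what one of those several hundred per-script assertions amounts to, i.e. running a git subcommand with specific flags and checking its output. Git's real suite does this in shell via `test_expect_success` in its `t/` scripts; this standalone approximation just assumes `git` is on `PATH`.)

```python
import os
import subprocess
import tempfile

def assert_status_shows_untracked():
    """One CLI-level assertion: `git status --porcelain` must list an
    untracked file with the `??` prefix, exactly as the real git does."""
    with tempfile.TemporaryDirectory() as d:
        subprocess.run(["git", "init", "-q", d], check=True)
        # Create an untracked file in the fresh repository.
        open(os.path.join(d, "new-file"), "w").close()
        out = subprocess.run(
            ["git", "-C", d, "status", "--porcelain"],
            capture_output=True, text=True, check=True,
        ).stdout
        assert "?? new-file" in out, out

assert_status_shows_untracked()
print("assertion passed")
```

A drop-in replacement would be run against the same assertion with its own binary substituted for `git`.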
| ▲ | djoldman 9 hours ago | parent | prev [-] | | This is the same as all the folks asking for and hawking quantized models. It doesn't matter if the parent model is GPT GOD mode mythos opus 100x Ultra. What matters is the performance of the quantized model. |
|
|
| ▲ | varispeed 10 hours ago | parent | prev | next [-] |
| Reminds me of when I was happy with my algorithm being super fast, until I started tackling edge cases. Suddenly it got quite slow. |
| |
| ▲ | yevbar 9 hours ago | parent [-] | | Edge cases certainly apply: scripts that depend on specific git CLI args or stdout strings may not work with ziggit. _However_, for the use cases that most developers or agents are looking for, ziggit should have enough features covered. Happy to fix issues or bugs if that's not the case | | |
| ▲ | ARandumGuy 8 hours ago | parent [-] | | > _However_, for the use cases that most developers or agents are looking for What use cases are those? How did you determine that these are the use cases most developers/agents are looking for? For me, git has a ton of features that I rarely use. But when I need them, I really need them. Any replacement that doesn't cover these edge cases is fundamentally incomplete and insufficient, even if it works fine 99% of the time. |
|
|
|
| ▲ | delusional 10 hours ago | parent | prev [-] |
| > The bun team has already tested using git's C library and found it to be consistently slower hence resorting to literally executing the git CLI when performing bun install. I find that to be a much more remarkable claim. Git doesn't have a C library, and even if it did, In which world is literally shelling out faster than a C library call? I suppose libgit2 could be implemented poorly. If we follow their link[1] we get some clarity. It's an markdown (ai prompt?) file which just states it. Apparently they've "microbenchmarked" creating new git repositories. I really wonder if creating new git repositories is really on their hot path, but whatever. Where does that claim in that random markdown file come from then? Well apparently, 3 years ago when somebody was "restructuring docs"[2] and just made it up. I guess there really is a class of "engineers" that the AI can replace. [1]: https://github.com/oven-sh/bun/blob/3ed4186bc8db8357c670307f...
[2]: https://github.com/oven-sh/bun/commit/011e157cac7698050370e2... |
| |
| ▲ | llimllib 9 hours ago | parent | next [-] | | libgit2 is not nearly as thoroughly tested as the git CLI is, and it is not actually hard to imagine that shelling out to the git CLI to create new repos is faster than calling into a C library. Your comment does not seem to be in good faith, implying that they've made up the performance difference. There's a comment with a benchmark here: https://github.com/oven-sh/bun/blob/4760d78b325b62ee62d6e47b... referencing the commit where they removed the ability to link with libgit2 because it was slower. Having built a service on top of libgit2, I can say that there are plenty of tricky aspects to using the library, and I'm not at all surprised that bun found they had to shell out to the CLI - most people who start building on libgit2 end up doing so. I don't know what the bun team actually did and I don't have details - but it seems completely plausible to me that they found the CLI faster for creating repositories. | | |
| ▲ | yevbar 9 hours ago | parent | next [-] | | I'm actually reassured to hear the git CLI is better covered than libgit2, since the CLI test suite is what I used as my "validation" for progress on meeting git's functionality. As for what happened with Bun and libgit2, my best guess honestly is something to do with Zig/C interop, but I don't doubt there are optimizations everywhere to be done | |
| ▲ | delusional 7 hours ago | parent | prev [-] | | > Your comment does not seem to be in good faith, implying that they've made up the performance difference. I believe I have accurately represented what the article says. Had the article provided the comment you have just linked, I would have commented on that as well. I did not intend to imply that they manufactured the performance difference, merely that they don't know what they are talking about. The thought I have in my head is that they are incompetent, not that they are malicious. I wholeheartedly agree that libgit2 is full of footguns, that's why it matters that it's not actually "git's own C library" but a separate project. I also agree that you usually end up shelling out to git, exactly because of those problems libgit2 has. If those problems aren't speed though, and I don't think they are, the blog post would have to cover how this reimplementation of libgit2 avoids those problems. I'm not here to litigate if bun would be faster with libgit2. I am however here to make the argument that the blogpost does not make a convincing argument for why libgit2 isn't good enough. |
| |
| ▲ | yevbar 9 hours ago | parent | prev [-] | | Bun attempted to integrate with libgit2 instead of spawning calls to the git CLI and found it to be consistently 3x slower, IIRC. The micro-benchmarks are for the internal git operations that Bun currently delegates to CLI calls. Overall, network time (i.e. the round trip to GitHub and back) is what evens out the performance when evaluating `bun install`, but there are still places where ziggit has visible wins, like on ARM-based Macs https://github.com/hdresearch/ziggit/blob/master/BENCHMARKS.... | | |
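(For context on what such a micro-benchmark measures: a rough sketch of timing repository creation through the spawned git CLI, which is the operation the Bun docs claim to have benchmarked. Comparing against libgit2's `git_repository_init` would require linking libgit2, e.g. via pygit2, which is not assumed here; only `git` on `PATH` is.)

```python
import subprocess
import tempfile
import time

def avg_git_init_ms(runs: int = 5) -> float:
    """Average wall-clock time to create a fresh repo by spawning the
    git CLI. Process-spawn overhead is included in each measurement,
    which is exactly what shelling out pays for on every call."""
    total = 0.0
    for _ in range(runs):
        with tempfile.TemporaryDirectory() as d:
            start = time.perf_counter()
            subprocess.run(["git", "init", "-q", d], check=True)
            total += time.perf_counter() - start
    return total * 1000 / runs

print(f"avg `git init` via CLI: {avg_git_init_ms():.1f} ms")
```

Whether that number beats a libgit2 call depends on the platform and how the library is driven, which is part of why the dispute above is hard to settle from the docs alone.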
| ▲ | delusional 7 hours ago | parent [-] | | I don't know what that "BENCHMARKS" document is supposed to show. When I try to replicate their results I'm getting wildly faster executions of standard git, and they don't provide enough details for me to theorize why. I also noticed that their version of the "blame src/main.zig" command doesn't actually work (it shows all lines as not being committed). Sure, it's easy to optimize an algorithm if you just don't do the work. Git does indeed take longer, but at least it actually gives you a blame of the file. | | |
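(The blame complaint above has a simple mechanical check: in `git blame --porcelain` output, lines not yet committed are attributed to the all-zeros hash, so a correct blame of a committed file must start with a real commit hash. A hypothetical sketch of that sanity check against the stock git CLI, assuming `git` is on `PATH`:)

```python
import os
import subprocess
import tempfile

def blame_is_committed(filename: str = "file.txt") -> bool:
    """Commit a one-line file, then verify `git blame --porcelain`
    attributes it to a real commit rather than the 40-zero
    "not committed yet" hash a broken blame would report."""
    with tempfile.TemporaryDirectory() as d:
        subprocess.run(["git", "init", "-q", d], check=True)
        with open(os.path.join(d, filename), "w") as f:
            f.write("hello\n")
        subprocess.run(["git", "-C", d, "add", filename], check=True)
        # Inline identity config so the commit works in a bare environment.
        subprocess.run(
            ["git", "-C", d, "-c", "user.name=t",
             "-c", "user.email=t@example.com",
             "-c", "commit.gpgsign=false", "commit", "-q", "-m", "init"],
            check=True,
        )
        out = subprocess.run(
            ["git", "-C", d, "blame", "--porcelain", filename],
            capture_output=True, text=True, check=True,
        ).stdout
        # Porcelain output leads with the attributed commit hash.
        return not out.startswith("0" * 40)

print(blame_is_committed())
```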
|
|