▲ | mgraczyk 9 days ago |
Thanks for the reply! At large companies with many designers, a lot of time is spent coordinating and planning, and LLMs can already help with that.

As far as design/copilot goes, I think there are reasons to be much more optimistic. Existing models haven't seen much Verilog; with better training data, it's reasonable to expect that they will improve to perform at least as well on Verilog as they do on Python. And even if there is only a 10% chance of that, it's reasonable for VCs to invest in these companies.
▲ | bsder 8 days ago |
> With better training data it's reasonable to expect that they will improve to perform at least as well on Verilog as they do on python.

There simply isn't enough of that code in existence. Writing Verilog is about mapping the constructs onto your theory of mind about the underlying hardware. If that were easy, so many engineers wouldn't have so much trouble writing Verilog that doesn't have faults. You can't write Verilog by pasting together Stack Overflow snippets.

Look at the confusion that happens when programmers carry their "for-loop" understanding into the world of GPU shaders or HDLs (hardware description languages), where for-loops map to hardware and are suddenly both finite and fixed. LLMs exhibit the exact same confusion--only worse.
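To make the for-loop point concrete, here's a minimal sketch in synthesizable Verilog (the module and signal names are made up). The loop doesn't execute at runtime; the synthesizer unrolls it at elaboration time into a fixed chain of adders, which is why the bound has to be a compile-time constant:

    // Hypothetical example: count the set bits in an 8-bit input.
    module popcount8 (
        input  wire [7:0] bits,
        output reg  [3:0] count
    );
        integer i;
        always @(*) begin
            count = 4'd0;
            // Unrolled by the synthesizer into eight adders wired in a
            // chain -- not a sequential loop executed step by step.
            for (i = 0; i < 8; i = i + 1)
                count = count + bits[i];
        end
    endmodule

Make the bound data-dependent (say, "i < n" for some runtime signal n) and it won't synthesize at all--exactly the mental-model break described above.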
▲ | catlifeonmars 9 days ago |
I’m actually curious whether there even is a large enough corpus of Verilog out there. I have noticed that even tools like Copilot tend to perform poorly with DSLs that are majority open source (on GitHub, no less!) but whose practical application is niche.

To put this in other terms, Copilot appears to _specialize_ in languages, libraries, and design patterns that have wide adoption, but does not appear to _generalize_ well to previously unseen or rarely seen languages, libraries, or design patterns.

Anyway, that’s largely anecdata/a sample size of 1, and it could very well be a case of me holding the tool wrong, but that’s what I observed.