pennomi | 3 days ago
For sure. The more specialized or obscure the things you have to do, the less LLMs help you. Building a simple marketing website? Probably don't waste your time - an LLM will probably be faster. Designing a new SLAM algorithm? The LLM will probably spin around in circles helplessly. That being said, that was my experience several years ago… maybe the state of the art has changed in the computer vision space.
|
heytakeiteasy | 3 days ago | parent | next
> The more specialized or obscure the things you have to do, the less LLMs help you.

I've been impressed by how this isn't quite true. A lot of my coding life is spent in the popular languages, which the LLMs obviously excel at. But a random robotics language dating to the '80s (Karel)? I unfortunately have to use it sometimes, and Claude ingested a several-hundred-page PDF manual for the language and is now better at it than I am. It doesn't even have a compiler to test against, and it still rarely makes mistakes. I think the trick with a lot of these LLMs is just figuring out the best techniques for using them. Fortunately, a lot of people are working all the time to figure this out.
jatora | 3 days ago | parent

Agreed. The sentiment you are replying to is a common one, and it's just people self-aggrandizing. No, almost nobody is working on code novel enough to be difficult for an LLM. All code projects build on things LLMs understand very well. Even if your architectural idea is completely unique, a never-before-seen magnum opus, the building blocks are still Legos.
|
|
monsieurbanana | 3 days ago | parent | prev | next
Specialized is probably not the word I'd use, because LLMs are generally useful for understanding more specialized / obscure topics. For example, I've never heard people casually talking about the DICOM standard, yet LLMs have no trouble with it.
phil21 | 3 days ago | parent | next

I think there is a sweet spot in the training data for these LLMs, where there is basically only "professional"-level documentation and chatter, without the layman stuff being picked up from Reddit, GitHub, etc.

I was trying to remember/figure out an obscure hardware communication protocol to work out enumeration of a hardware bus on some servers. Feeding Codex a few RFC URLs and other such information, plus telling it to search the internet, resulted in extremely rapid progress vs. having to wade through 500 pages of technical jargon and specification documents.

I'm sure if I were extending the spec to a 3.0 version in hardware or something it would not be useful, but for someone who just needs to understand the basics to get some quick tooling stood up, it was close to magic.
i_cannot_hack | 3 days ago | parent | prev | next

The standard for obscurity is different for LLMs; something can be very widespread and public without the average person knowing about it. DICOM is used at practically every hospital in the world, there are whole websites dedicated to browsing the documentation, companies employ people solely for DICOM work, there are popular maintained libraries for several different languages, etc., so the LLM has an enormous amount of it in its training data.

The question relevant for LLMs would be "how many high-quality results would I get if I googled something related to this?", and for DICOM the answer is "many". As long as that is the case, LLMs will not have trouble answering questions about it either.
aleph_minus_one | 3 days ago | parent | prev

> LLMs are generally useful for understanding more specialized / obscure topics

A very simple kind of query that in my experience causes problems for many current LLMs is: "Write {something obscure} in the Wolfram programming language."
AlotOfReading | 3 days ago | parent

One tendency I've noticed is that LLMs struggle with creativity. If you give them a language with extremely powerful and expressive features, they'll often fail to use those features to simplify other problems the way a good programmer does. Wolfram is a language essentially designed around that.

I wasn't able to replicate this in my own testing, though. Do you know if it also fails for "Mathematica" code? There's much more text online about that.
aleph_minus_one | 3 days ago | parent

> Do you know if it also fails for "Mathematica" code?

My experience using "Mathematica" instead of "Wolfram" in AI tasks is similar.
|
someguynamedq | 3 days ago | parent | prev | next
Several years ago is ancient history, given the rate of advancement LLMs have had recently.
|
archagon | 3 days ago | parent | prev
> Building a simple marketing website? Probably don't waste your time - an LLM will probably be faster.

This is actually where I would be most reluctant to use an LLM. Your website represents your product, and you probably don't want to give it the scent of homogenized AI slop. People can tell.
pennomi | 2 days ago | parent

They can tell if you let it use whatever CSS it wants (Claude will nearly always make a purple or blue website with gross rainbow gradients). They can also tell if you let it write your marketing copy. If you decide on your own brand colors and wording, there's very little left about the code that can't be done instantly by an LLM (at least on a marketing website).
Starman_Jones | 18 hours ago | parent

I just read Claude's front-end design instructions, and it now explicitly bans purple gradients. Curious to see what new pattern it will latch on to.
|
|