bananapub | 4 days ago
I think something everyone in our area is underpricing is that LLMs are uniquely well suited to writing code for programmers. It's a very constrained task: you can do lots of reliable checking on the output at low cost (linters, formatters, the compiler), the code is mostly reviewed by a human before being committed, and there's insulation between the code and the real world, because ultimately some company or open source project releases the code that then gets run, and they mostly have an incentive not to murder people (Tesla excepted, obviously).

It seems like lots of programmers are taking that experience and deeply overestimating how useful LLMs are at anything else, and these programmers - and the marketing people who employ them - are doing enormous harm by convincing e.g. HR departments that LLMs are of any value for dealing with complaints, or, much more dangerously, convincing governments that they're useful for dealing with humans asking for help. This misconception (and deliberate lying by people like OpenAI) is doing enormous damage to society and is going to do much, much more.