| ▲ | borski 3 hours ago |
| LLMs can build anything. The real question is what is worth building, and how it’s delivered. That is what is still human. LLMs, by nature of not being human, cannot understand humans as well as other humans can. (See every attempt at using an LLM as a therapist.) In short: LLMs will eventually be able to architect software. But it’s still just a tool. |
|
| ▲ | silisili 3 hours ago | parent | next [-] |
| What is the use of a software engineer/architect at that point? It's a tool, but one that product or C-levels can use directly, as I see it. |
| |
| ▲ | borski 3 hours ago | parent | next [-] | | Yes, for building something. But for building the right thing? Doubtful. Most of a great engineer’s work isn’t writing code, but interrogating what people think their problems are, to find what the actual problems are. In short: problem solving, not writing code. | | |
| ▲ | mattmanser an hour ago | parent [-] | | Where's this delusion come from recently that great engineers didn't write code? What a load of crap. All you're doing is describing a different job role. What you're talking about is BA work, and a subset of engineers are great at it, but most are just OK. You're claiming a part of the job that was secondary, and not required, is now the whole job. | | |
| ▲ | borski an hour ago | parent [-] | | I never said great engineers didn’t write code. But writing the code was never the point. The point has always been delivering the product to the customer, in any industry. Code is rarely the deliverable. That’s my point. |
|
| |
| ▲ | 0xbadcafebee 3 hours ago | parent | prev [-] | | A software engineer will be a person who inspects the AI's work, same as a building inspector today. A software architect will co-sign on someone's printed-up AI plans, same as a building architect today. Some will be in-house, some will do contract work, and some will be artists trying to create something special, same as today. The brute labor is automated away, and the creativity (and liability) is captured by humans. |
|
|
| ▲ | roncesvalles 2 hours ago | parent | prev [-] |
| FWIW I find LLMs to be excellent therapists. The commercial solutions probably don't work because they don't use the best SOTA models and/or sully the context with all kinds of guardrails and role-playing nonsense. But if you just open a new chat window in your LLM of choice (set to the highest-thinking paid-tier model), it gives you truly excellent therapeutic advice. In fact, in many ways the LLM therapist is actually better than the human, because e.g. you can dump a huge, detailed rant in the chat and it will actually listen to (read) every word you said. |
| |
| ▲ | borski 2 hours ago | parent [-] | | Please, please, please don’t make this mistake. It is not a therapist. At best, it might be a facsimile of a life coach, but it does not have your best interests in mind. It is easy to convince and trivial to make obsequious. That is not what a therapist does. There’s a reason they spend thousands of hours in training; that is not an exaggeration. Humans are complex. An LLM cannot parse that level of complexity. | | |
| ▲ | roncesvalles 2 hours ago | parent | next [-] | | You seem to think therapists are only for those in dire straits. Yes, if you're at that point, definitely speak to a human. But there are many ordinary things for which "drop-in" therapist advice is also useful. For me: mild road rage, social anxiety, processing embarrassment from past events, etc. The tools and reframing that LLMs have given me (Gemini 3.0/3.1 Pro) have been extremely effective and have genuinely improved my life. These things don't even cross the threshold to be worth the effort to find and speak to an actual therapist. | | |
| ▲ | defrost 2 hours ago | parent | next [-] | | Which professional therapist does your Gemini 3.0/3.1 Pro model see? Do you think I could use an AI therapist to become a more effective and much improved serial killer? | |
| ▲ | borski 2 hours ago | parent | prev [-] | | I never said therapists were only for those in crisis; that is a misreading of my argument entirely. An LLM cannot parse the complexity of your situation. Period. It is literally incapable of doing that, because it does not have any idea what it is like to be human. Therapy is not an objective science; it is, in many ways, subjective, and the therapeutic relationship is by far the most important part. I am not saying LLMs are not useful for helping people parse their emotions or understand themselves better. But that is not therapy, in the same way that using an app built for CBT is not, in and of itself, therapy. It is one tool in a therapist’s toolbox, and will not be the right tool for all patients. That doesn’t mean it isn’t helpful. But an LLM is not a therapist. The fact that you can trivially convince it to believe things that are absolutely untrue is one simple example of precisely why. |
| |
| ▲ | pzs an hour ago | parent | prev [-] | | While I agree with you, I also find that an LLM can help organize my thoughts and lead to realizations that I just didn't get to, because I hadn't explained verbally what I am thinking and feeling. It is definitely not a substitute for human interaction and relationships, which can be fulfilling in many, many ways LLMs are not, but LLMs can still be helpful as long as you exercise your critical thinking skills. My preference remains always to talk to a friend, though. EDIT: seems like you made the same point in a child comment. | | |
| ▲ | borski an hour ago | parent [-] | | Yeah, I agree with all of that. A friend built an “emotion aware” coach, and it is extremely useful to both of us. But he still sees a therapist, regularly, because they are not the same and do not serve the same purpose. :) |
|
|
|