| ▲ | galaxyLogic 2 hours ago |
| Well, one could say that since it's AI, it should be able to tell us what we're doing wrong. No? AI is supposed to make our work easier. |
|
| ▲ | kace91 2 hours ago | parent [-] |
| Doing wrong with respect to what? If you ask for A, how would any system know that you actually wanted to ask for B? |
| ▲ | walt_grata an hour ago | parent [-] |
| Honestly, IMO it's more that I ask for A but don't discourage B strongly enough, so I get A, B, and maybe C, generally implemented poorly. The base systems need more focus and doubt built in before they'll be truly useful for anything other than greenfield apps, or for generating maintainable code. |
|