sothatsit 2 hours ago
That is why a fully automated firm would be a paradigm shift. Instead of requiring a person to be responsible and to QA things, you let AI systems be responsible internally and hold the company as a whole responsible for legal concerns. This idea of an automated firm rests on the premise that AI will become more capable and reliable than people.
twosdai an hour ago | parent
In this regard, a company cannot be created without at least one person legally tied to it; even shell corporations have a person on record as responsible. So some human has to be a part of it. In any "normal" organization, a person tied to the outcome of the company presumably cares about it, and if the AI does good work 99.99% of the time but can still make mistakes, that person will still be signing off on all of its work. Which leads to a system of people reviewing and approving work, not exactly a fully autonomous firm.