mavamaarten | 2 days ago
So true. We used to appoint someone in the group to take notes. Those notes were always correct, to the point, short, and easy to read. Now our manager(s) are heavily experimenting with recording all meetings and desperately trying to produce useful reports with all sorts of AI tools.

The output is always lengthy and makes the manager super happy. Look, amazing reports! But on closer inspection they're consistently incomplete in one way or another, sometimes confidently incorrect, and full of happy corpo mumbo jumbo. More slop to wade through when you're looking for factual information later on.

Our manager is so happy to report that he's using AI for everything, even in cases where I think completeness and correctness are important. I honestly find it scary how quickly that desire for correctness is gone, replaced with "haha, this is cool tech."

We devs are much more reluctant. We don't want to fall behind, but in the end, when it comes to correctness and accountability, we're the ones responsible. So I won't brainlessly dump my work into an LLM and take its word for granted.
ElevenLathe | 2 days ago | parent
It's their company; we just work at it. If we want to exert more control in the workplace, we obviously need more power in the workplace. In the meantime, if they want the equivalent of their company's prefrontal cortex to be burned out with a soldering iron, that's their prerogative.