snozolli 3 hours ago

Back around 2005, I worked with a guy who was trying to position himself as the go-to expert on the team. He'd always jump at the chance to explain things to QA and the support team. We'd occasionally hear follow-up questions from those teams and realize that he was just making things up.

He also had a serious case of cargo-cult mentality. He'd see some behavior, ascribe it to something unrelated, then insist with almost religious fervor that things had to be coded a certain way. He was also a yes-man who would instantly cave to whatever whim management indicated. We'd go into a meeting in full agreement that a requested feature was damaging to our users, and he'd be nodding along with management like a bobblehead as they failed to grasp the problem.

Management never noticed that he was constantly misleading other teams, or that he checked in flaky code he found on the Internet that cost multiple days of developer time to debug. They saw him as a highly productive team player who was always willing to "help" others. He ended up promoted to management.

Anyway, my point is that management seems to care primarily about having their ego boosted, and about seeing what they perceive as a hard worker, even if that worker is just spinning his wheels and throwing mud on everyone else. I'm sure that AI is only going to exacerbate this weird, counterproductive corporate system.
switchbak 2 hours ago

I find it astounding how otherwise intelligent people fall for such obvious theatre. It takes a particular mindset to filter this out, and that mindset is almost entirely absent from typical management. As usual, if you don't have a reliable signal, or acquiring that signal takes too long, you fall back on cheap proxy signals: confidence over competence, etc. And those who are best at self-promotion and politics win.

I've got recent experience with exactly this: someone who is completely out of their depth, misrepresenting their actual capabilities. Their reliance on AI is so strong because of this lack of depth, to such a degree that they never learn anything. Lately they've been creating drama and endless discussions about dumb things to a) try to appear like they have strong opinions, and b) filibuster the time so they don't have to talk about important things related to their work output.
ekropotin an hour ago

> He ended up promoted to management.

I bet that with such qualities he's a VP by now.
mannanj 3 hours ago

Agreed. To me, it seems that the management tier you describe is exactly the set of people funding and marketing AI to the world. They want to maintain their status and position while lowering the value of the actual experts, and, as this article says, they feel confident in their impersonations of those experts.