for AI workloads? You're wrong. I use mine as a server and just SSH into it. I don't even have a keyboard or display hooked up to it.
You can get 96 GB of VRAM (unified memory) and roughly 40-70% of a 4090's speed for $4,000.
It especially makes sense when you're running a large number of applications that need to talk to each other. The only way to do that on a 4090 is to hit disk: shut one application down, start the other, read its model back from disk. It's slow. The other option is a multi-GPU system, but then you're into real money.
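Back-of-the-envelope, here's why swapping through disk hurts. The model size and NVMe throughput below are illustrative assumptions, not measurements:

```python
# Rough estimate of the cost of swapping models through disk
# instead of keeping both resident in unified memory.
# Both numbers are assumptions for illustration only.
model_size_gb = 40        # hypothetical model too big to co-reside on a 24 GB 4090
nvme_read_gbps = 3.0      # assumed sequential NVMe read throughput, GB/s

reload_seconds = model_size_gb / nvme_read_gbps
print(f"Reloading from disk: ~{reload_seconds:.0f} s per swap")
# With 96 GB of unified memory, both models can simply stay loaded,
# so switching between them costs nothing.
```

Pay that reload on every switch and it dominates the workflow; resident models make the switch effectively free.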
Trust me, it's a game changer. I just have it sitting in a closet and use it all the time.
The other nice thing is that, unlike with any Nvidia product, you can walk into an Apple Store, pay the retail price, and get it right away. No scalpers, no hunting.