robotnikman 3 days ago

GIGABYTE recently did this with their new 'AI' box

https://www.gigabyte.com/Graphics-Card/GV-N5090IXEB-32GD

derefr 3 days ago | parent [-]

That misses the "vertically integrated" part. (As does everything else right now, which was my point.)

The thing you linked is just a regular Gigabyte-branded 5090 PCIe GPU card (that they produced first, for other purposes; and which does fit into a regular x16 PCIe slot in a standard ATX chassis), put into a (later-designed) custom eGPU enclosure. The eGPU box has some custom cooling [that replaces the card's usual cooling] and a nice little PSU — but this is not any more "designing the card around the idea it'll be used in an enclosure" than what you'd see if an aftermarket eGPU integrator built the same thing.

My point was rather that, if an OEM [that produces GPU cards] were to design one of their GPU cards specifically and only to be shipped inside an eGPU enclosure that was designed together with it — then you would probably get higher perf, with better thermals, at a better price(!), than you can get today from just buying a standalone peripheral-card GPU (even with the cost of the eGPU enclosure and the rest of its components taken into account!)

Where by "designing the card and the enclosure together", that would look like:

- the card being this weird nonstandard-form-factor non-card-edged thing that won't fit into an ATX chassis or plug into a PCIe slot — its only means of computer connection would be via its Thunderbolt controller

- the eGPU chassis the card ships in, being the only chassis it'll comfortably live in

- the card being shaped less like a peripheral card and more like a motherboard, like the ones you see in embedded industrial GPU-SoC [e.g. automotive LiDAR] use-cases — spreading out the hottest components to ensure nothing blocks anything else in the airflow path

- the card/board being designed to expose additional water-cooling zones — where these zones would be pointless to expose on a peripheral card, as they'd be e.g. on the back of the card, where the required cooling block would jam up against the next card in the slot-array

...and so on.

It's the same logic that explains why those factory-sealed Samsung T-series external NVMe pucks can cost less than the equivalent amount of internal m.2 NVMe. With m.2 NVMe, you're not just forced into a specific form-factor (which may not be electrically or thermally optimal), but you're also constrained to a lowest-common-denominator assumption of deployment environment in terms of cooling — and yet you have to ensure that your chips stay stable in that environment over the long term. Which may require more-expensive chips, longer QC burn-in periods, etc.

But when you're shipping an appliance, the engineering tolerances are the tolerances of the board-and-chassis together. If the chassis of your little puck guarantees some level of cooling/heat-sinking, then you can cheap out on chips without increasing the RMA rate. And so on. This can (and often does) result in an overall-cheaper product, despite that product being an entire appliance vs. a bare component!

wrs 3 days ago | parent | next [-]

Random observation: This is very similar to the rationale for the “trash can” Mac Pro.

justsomehnguy 3 days ago | parent | prev [-]

> Gigabyte-branded 5090 PCIe GPU

The hottest one on the consumer market

> The eGPU box has some custom cooling

Custom liquid cooling to tame the enormous TDP

> and a nice little PSU

Yeah, an 850W one.
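
For scale, a rough power-budget sketch with assumed numbers (the ~575W TGP is NVIDIA's published figure for the 5090; the overhead line is a guess for illustration):

    # Rough power-budget sanity check, illustrative only.
    gpu_tgp_w = 575    # RTX 5090 rated TGP (published figure)
    overhead_w = 75    # assumed: Thunderbolt controller, pump, fans, conversion losses
    psu_w = 850        # PSU in the enclosure

    headroom_w = psu_w - (gpu_tgp_w + overhead_w)
    print(f"headroom: {headroom_w} W ({headroom_w / psu_w:.0%} of capacity)")
    # -> headroom: 200 W (24% of capacity)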

> were to design one of their GPU cards specifically and only to be shipped inside an eGPU enclosure that was designed together with it

And why would they do so?

Do you understand that it would drive the price up a lot?

> at a better price(!)

With lower production/sales numbers than a regular 5090 GPU? No way. Economics 101.
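
A toy amortization sketch (the numbers are entirely made up, just to show why a small production run hurts unit price):

    # Illustrative only: hypothetical NRE and per-unit costs, not real figures.
    def unit_cost(nre_usd, marginal_usd, units):
        # fixed design/tooling cost spread over the run, plus per-unit build cost
        return marginal_usd + nre_usd / units

    nre, marginal = 20_000_000, 1_500
    for units in (10_000, 100_000, 1_000_000):
        print(f"{units:>9,} units -> ${unit_cost(nre, marginal, units):,.0f} each")
    # ->    10,000 units -> $3,500 each
    # ->   100,000 units -> $1,700 each
    # -> 1,000,000 units -> $1,520 each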

> the card being this weird nonstandard-form-factor non-card-edged thing

Even if we skip the small-series nuances (which make this a non-starter on price alone), there is little that some other 'nonstandard form factor' can do for the cooling: you still need the RAM near the chip... and that's all. You've just designed the same PCIe card for the sake of making it incompatible.

> won't ... plug into a PCIe slot

Again - why? What would this provide that the current PCIe GPU lacks? BTW, you still need the 16 lanes of PCIe, and you know which connector provides the most useful and cost-effective way to do so? A regular x16 PCIe connector. The one you just ditched.

> the card being shaped less like a peripheral card and more like a motherboard

You don't need to 'redesign it from scratch'; it's enough not to be constrained by the 25cm length limit to get proper airflow along a properly oriented radiator.

> why those factory-sealed Samsung T-series external NVMe pucks

Lol: https://www.zdnet.com/article/why-am-i-taking-this-samsung-t...