Den_VR 19 hours ago

As I recall, Gartner made the outrageous claim that upwards of 70% of all computing will be "AI" within some number of years - nearly the end of CPU workloads.

deliciousturkey 16 hours ago | parent | next [-]

I'd say over 70% of all computing has already been non-CPU for years. If you look at your typical phone or laptop SoC, the CPU is only a small part. The GPU takes up the majority of the area, with other accelerators also taking significant space. Manufacturers would not spend that money on silicon if it were not already being used.

goku12 15 hours ago | parent | next [-]

> I'd say over 70% of all computing has already been non-CPU for years.

> If you look at your typical phone or laptop SoC, the CPU is only a small part.

Keep in mind that die area doesn't always correspond to the throughput (average rate) of the computations done on it. That area may be allocated for higher computational bandwidth (peak rate) and lower latency. In other words, it's there to get the results of a large batch of computations back faster, even if the circuits sit idle for the rest of the cycles. I don't know the situation on mobile SoCs with regard to those quantities.
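To make the peak-versus-sustained distinction concrete, here's a rough back-of-envelope sketch; all of the peak rates and utilization figures are made-up assumptions purely for illustration, not measurements of any real chip:

```python
# Back-of-envelope: peak rate vs. delivered throughput.
# All figures below are hypothetical, just to illustrate the point.

def delivered_tflops(peak_tflops: float, utilization: float) -> float:
    """Average throughput = peak rate x fraction of cycles actually doing work."""
    return peak_tflops * utilization

# A big accelerator block: huge peak rate, but idle most of the time.
npu = delivered_tflops(peak_tflops=20.0, utilization=0.05)   # 1.0 TFLOPS sustained

# A much smaller CPU cluster: modest peak, but busy almost constantly.
cpu = delivered_tflops(peak_tflops=1.5, utilization=0.60)    # 0.9 TFLOPS sustained

print(f"NPU sustained: {npu:.1f} TFLOPS, CPU sustained: {cpu:.1f} TFLOPS")
```

So a block that dominates the die by area can still account for a modest share of the work actually performed over time.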

deliciousturkey 15 hours ago | parent [-]

This is true, and my example was a very rough metric. But compute density per unit area is actually way, way higher on GPUs than on CPUs. CPUs only spend a tiny fraction of their area doing actual computation.

swiftcoder 12 hours ago | parent | prev | next [-]

> If you look at your typical phone or laptop SoC, the CPU is only a small part

In mobile SoCs, a good chunk of this is power efficiency. On a battery-powered device, there's always a tradeoff between spending die area to make something like 4K video playback more power efficient and spending it on general-purpose compute (rough numbers in the sketch below).

Desktop-focussed SKUs are more liable to spend a metric ton of die area on bigger caches close to your compute.
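To put the video-playback tradeoff above in numbers, here's a crude sketch; the power and battery figures are assumptions chosen only to show the shape of the argument, not measured values:

```python
# Rough energy comparison: fixed-function video decode vs. software decode.
# Power and capacity figures are assumed for illustration only.

battery_wh = 15.0          # assumed phone battery capacity in watt-hours
hours_of_video = 2.0

hw_decode_watts = 0.5      # assumed: dedicated decoder block doing the work
sw_decode_watts = 4.0      # assumed: same codec decoded on the CPU cores

hw_energy = hw_decode_watts * hours_of_video   # 1.0 Wh
sw_energy = sw_decode_watts * hours_of_video   # 8.0 Wh

print(f"Hardware decode: {hw_energy:.1f} Wh ({hw_energy / battery_wh:.0%} of battery)")
print(f"Software decode: {sw_energy:.1f} Wh ({sw_energy / battery_wh:.0%} of battery)")
```

Under those assumptions the dedicated block pays for its die area many times over in battery life.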

PunchyHamster 15 hours ago | parent | prev [-]

Going by raw operations performed, that's probably true for desktops/laptops if the given workload uses 3D rendering for the UI. Watching a YouTube video is essentially the CPU pushing data from the internet to the GPU's video decoder and to the GPU-accelerated UI.

yetihehe 16 hours ago | parent | prev [-]

Looking at home computers, most of the "computing", when counted as FLOPS, is done by GPUs anyway, just to show more and more frames. The processors are only used to organise all the data that gets crunched by the GPUs. The rest is browsing webpages and running Word or Excel a few times a month.
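For a sense of scale, a back-of-envelope estimate; the per-pixel shading cost and CPU utilization below are rough assumptions for illustration, not benchmarks:

```python
# Back-of-envelope: GPU work to draw frames vs. CPU work in the same second.
# All numbers are rough assumptions for illustration.

width, height, fps = 3840, 2160, 60              # a 4K display at 60 Hz
shader_flops_per_pixel = 200                     # assumed cost to shade/composite one pixel

gpu_flops_per_sec = width * height * fps * shader_flops_per_pixel   # ~1e11 FLOPS

cpu_flops_per_sec = 2e9 * 0.05                   # assumed: a few-GHz core, mostly idle, light scalar work

print(f"GPU: ~{gpu_flops_per_sec:.1e} FLOPS, CPU: ~{cpu_flops_per_sec:.1e} FLOPS")
print(f"GPU does ~{gpu_flops_per_sec / cpu_flops_per_sec:.0f}x the raw arithmetic")
```

Even with generous assumptions for the CPU side, just keeping the display full of frames dwarfs the scalar work by orders of magnitude.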