AnthonyMouse 3 days ago

> By prioritizing efficiency, Apple also prioritizes integration. The PC ecosystem prefers less integration (separate RAM, GPU, OS, etc) even at the cost of efficiency.

People always say this but "integration" has almost nothing to do with it.

How do you lower the power consumption of your wireless radio? You have a network stack that queues non-latency sensitive transmissions to minimize radio wake-ups. But that's true for radios in general, not something that requires integration with any particular wireless chip.
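A minimal sketch of that idea (all names here are illustrative, not any real network stack's API): buffer background traffic and flush it in batches, so the radio wakes once per batch rather than once per packet.

```python
class RadioTxQueue:
    """Toy model: batch non-latency-sensitive packets to
    minimize radio wake-ups. Urgent packets still go out
    immediately."""

    def __init__(self):
        self.queue = []
        self.wakeups = 0  # each transmit models one radio wake-up

    def send(self, packet, urgent=False):
        if urgent:
            self._transmit([packet])   # latency-sensitive: wake now
        else:
            self.queue.append(packet)  # defer until the next flush

    def flush(self):
        if self.queue:
            self._transmit(self.queue)
            self.queue = []

    def _transmit(self, packets):
        self.wakeups += 1


q = RadioTxQueue()
for i in range(10):
    q.send(f"telemetry-{i}")  # 10 background packets, no wake-ups yet
q.flush()
assert q.wakeups == 1  # one wake-up instead of ten
```

Nothing about this depends on which radio chip is underneath; the batching policy lives entirely in software.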

How do you lower the power consumption of your CPU? Remediate poorly written code that unnecessarily keeps the CPU in a high power state. Again not something that depends on a specific CPU.
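The classic version of that mistake is a busy-wait loop: the thread spins at full speed instead of letting the scheduler idle the core. A hedged sketch (hypothetical names, any real codebase would differ):

```python
import threading

done = threading.Event()

def bad_wait():
    # Spins: keeps the CPU in a high power state until the flag flips.
    while not done.is_set():
        pass

def good_wait():
    # Blocks: the thread sleeps and the core can drop to a low power
    # state until the event wakes it.
    done.wait()

t = threading.Thread(target=good_wait)
t.start()
done.set()   # wake the waiter
t.join()
```

The fix is the same whether the CPU is Intel, AMD, or ARM, which is the point: it's a software problem, not an integration problem.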

How much power is saved by soldering the memory or CPU instead of using a socket? A negligible amount if any; the socket itself has no significant power draw.

What Apple does well isn't integration, it's choosing (or designing) components that are each independently power efficient, so that the entire device is. Which you can do perfectly well in a market of fungible components simply by choosing the ones with high efficiency.

In fact, a major problem in the Android and PC laptop market is that the devices are insufficiently fungible. You find a laptop you like where all the components are efficient except that it uses an Intel processor instead of the more efficient ones from AMD, but those components are all soldered to a system board that only takes Intel processors. Another model has the AMD APU but the OEM there chose poorly for the screen.

It's a mess not because the integration is poor but because the integration exists instead of allowing you to easily swap out the part you don't like for a better one.

adgjlsfhk1 3 days ago | parent

> How much power is saved by soldering the memory or CPU instead of using a socket? A negligible amount if any; the socket itself has no significant power draw.

This isn't quite true. When the whole chip is idling at 1-2W, 0.1W of socket power is 10%. Some of Apple's integration almost certainly saves power (e.g. putting the storage controller for the SSD on the SoC, having tightly integrated display controllers, etc).

AnthonyMouse 3 days ago | parent

> When the whole chip is idling at 1-2W, 0.1W of socket power is 10%.

But how are you losing 10% of power to the socket at idle? Having a socket might require slightly longer traces, but the losses from that are proportional to overall power consumption and not very large, and both CPU sockets and the new CAMM memory standard are specifically designed to minimize them anyway (primarily for latency rather than power reasons, because the power difference is so trivial).

> Some of Apple's integration almost certainly save power (e.g. putting storage controllers for the SSD on the SOC, having tightly integrated display controllers, etc).

This isn't really integration and is very nearly the opposite: the primary hardware advantage here is that the SoC is fabbed on 3nm and so the storage controller is too, and you would get the same advantage by making an independent storage controller on the same process.

Which is the problem with PCs again: the SSDs are too integrated. Instead of giving the OS raw access to the flash chips, they attach a separate controller just to do error correction and block remapping, tasks that could better be handled by the OS on the main CPU (which is fabbed on a newer process) or, in larger devices with a storage array, by a RAID controller that performs the task for multiple drives at once.
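The block-remapping half of that job is not exotic. Here is a toy host-side flash translation layer (hypothetical code; real host-managed designs like open-channel SSDs or NVMe ZNS are far more involved, and this skips erase cycles and garbage collection entirely): raw flash can't be overwritten in place, so every write goes to a fresh physical block and the logical-to-physical map is updated.

```python
class HostFTL:
    """Toy flash translation layer run on the host CPU.
    Maps logical block addresses to physical flash blocks,
    remapping on every write since flash can't be
    overwritten in place."""

    def __init__(self, nblocks):
        self.free = list(range(nblocks))  # unwritten physical blocks
        self.map = {}                     # logical -> physical
        self.flash = {}                   # physical block contents

    def write(self, lba, data):
        phys = self.free.pop(0)           # always write a fresh block
        old = self.map.get(lba)
        if old is not None:
            self.free.append(old)         # stale copy is reclaimable
        self.map[lba] = phys
        self.flash[phys] = data

    def read(self, lba):
        return self.flash[self.map[lba]]


ftl = HostFTL(nblocks=4)
ftl.write(0, b"v1")
ftl.write(0, b"v2")           # remapped to a new block, not overwritten
assert ftl.read(0) == b"v2"
assert ftl.map[0] == 1        # second write landed on physical block 1
```

Run on the main CPU, this logic also gets every process-node improvement for free, which is the point above.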

And which would you rather have, a dual-core ARM thing integrated with your SSD, or the same silicon going to two more E-cores on the main CPU which can do the storage work when there is any but can also run general purpose code when there isn't?