| ▲ | nehalem 5 hours ago |
| Whenever I see another supposedly menial device including enough general purpose hardware to run Doom, I wonder whether I should think of that as a triumph of software over hardware or an economic failure to build cheaper purpose-built hardware for things like sending audio over a radio. |
|
| ▲ | Aurornis 2 hours ago | parent | next [-] |
| > Whenever I see another supposedly menial device including enough general purpose hardware
The PineBuds are designed and sold as an open-firmware platform to allow software experimentation, so there's nothing bad and no economic failure going on here. Having a powerful general-purpose microcontroller to experiment with is a design goal of the product.
That said, ANC Bluetooth earbuds are not menial products. Doing ANC properly is very complicated. It's much harder than taking the input from a microphone, inverting the signal, and feeding it into the output. There's a lot of computation that needs to run continuously.
Using a powerful microcontroller isn't a failure, it's a benefit of having advanced semiconductor processes. Basically anything small and power-efficient on a modern process will have no problem running at tens of MHz. You want the modern process for battery efficiency, and you get the speed as a bonus.
The speed isn't wasted, either. Higher clock speeds mean lower latency. In a battery-powered device, an MCU running at 48MHz may seem excessive until you realize that the faster it finishes each unit of work, the sooner it can go to sleep. It's not always about raw power.
Modern earbuds are complicated. Having a general-purpose MCU that allows software updates is much better than trying to get the entire wireless stack, noise cancellation, and everything else completely perfect before spinning a custom ASIC. We're very fortunate to have all of this at our disposal. The grumbling about putting powerful microcontrollers into small things ignores how hard it is to make a bug-free custom ASIC and break even on it, relative to spending $0.10 per unit on a proven microcontroller made at scale. |
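A rough sketch of what "a lot of computation that needs to run continuously" means for ANC: a toy feed-forward LMS loop that re-tunes an anti-noise filter on every sample against what the in-ear microphone actually hears. This is only an illustration under simplifying assumptions (real earbud firmware uses filtered-x LMS with a secondary-path model, fixed-point DSP, and combined feed-forward/feedback mics); the names and parameters here (anc_step, 32 taps, mu = 0.01) are hypothetical, not anything from the PineBuds firmware.

    /* Toy feed-forward ANC loop: adapt a small FIR filter, every sample,
     * to drive the residual heard at the in-ear (error) mic toward zero.
     * Illustrative only; see the caveats above. */
    #include <math.h>
    #include <stdio.h>

    #define TAPS 32              /* adaptive filter length (illustrative) */

    static float w[TAPS];        /* adaptive coefficients */
    static float x[TAPS];        /* recent outside-mic (reference) samples */

    /* One call per audio sample. ref = outside mic, err = in-ear residual
     * from the previous output (real systems have acoustic/ADC delay here).
     * Returns the anti-noise sample to mix into the speaker feed. */
    static float anc_step(float ref, float err, float mu)
    {
        for (int i = TAPS - 1; i > 0; i--)   /* shift in the newest reference */
            x[i] = x[i - 1];
        x[0] = ref;

        float y = 0.0f;                      /* anti-noise = w . x */
        for (int i = 0; i < TAPS; i++)
            y += w[i] * x[i];

        for (int i = 0; i < TAPS; i++)       /* LMS update: nudge the filter so */
            w[i] -= mu * err * x[i];         /* the residual at the ear shrinks */

        return y;
    }

    int main(void)
    {
        /* Toy simulation: the "noise" is a 200 Hz tone at 48 kHz; the in-ear
         * mic hears the noise plus the anti-noise from the previous sample. */
        float anti = 0.0f;
        for (int n = 0; n < 48000; n++) {
            float noise = sinf(2.0f * 3.14159265f * 200.0f * (float)n / 48000.0f);
            float residual = noise + anti;               /* what the ear hears */
            anti = anc_step(noise, residual, 0.01f);
            if (n % 8000 == 0)
                printf("n=%5d residual=%+.3f\n", n, residual);
        }
        return 0;
    }

Even this stripped-down loop is dozens of multiply-accumulates per sample at the codec rate, forever, and production filters are larger and run alongside Bluetooth, codecs, and EQ; that is why "just invert the mic" undersells the workload.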
|
| ▲ | rogerrogerr 5 hours ago | parent | prev | next [-] |
Or a third option - an economic success, in that economies of scale have made massively capable hardware the cheapest option for many applications, despite being overkill.
| |
| ▲ | AlecSchueler 2 hours ago | parent | next [-] | | Or the fourth option, an environmental disaster all around | | |
| ▲ | dubbie99 2 hours ago | parent | next [-] | | The materials that go into a chip are nothing. The process of making the chip is roughly the same no matter how powerful it is. So having one chip that can satisfy a large range of customers' needs is so much better than wasting development time making a custom, just-good-enough chip for each. | | |
| ▲ | AlecSchueler an hour ago | parent [-] | | > The materials that go into a chip are nothing. They really aren't. Every material that goes into every chip needs to be sourced from various mines around the world, shipped to factories to be assembled, then the end goods need to be shipped again around the world to be sold or directly dumped. High power, low power, it all has negative environmental impact. | | |
| ▲ | cortesoft 14 minutes ago | parent | next [-] | | That doesn't contradict the point, though. The negative impact on the environment is not reduced by making a less powerful chip. |
| ▲ | direwolf20 an hour ago | parent | prev [-] | | Which materials are they and how would you suggest doing it with fewer materials? | | |
| ▲ | serf 41 minutes ago | parent | next [-] | | Ultra-pure water production itself is responsible for untold amounts of hydrofluoric acid and ammonia, and most etching processes have an F-gas involved, and most plants that do this work have tremendously high energy (power) costs due to stability needs/HVAC. It's not 'just sand'. | | | |
| ▲ | stackghost an hour ago | parent | prev [-] | | In theory, graphene-based semiconductors would eliminate a lot of the need for shipping and mining.
|
|
| |
| ▲ | Aurornis an hour ago | parent | prev | next [-] | | It’s the opposite. Using an off the shelf MCU is much more efficient than trying to spin your own ASIC. Doing the work in software allows for updates and bug fixes, which are more likely to prevent piles of hardware from going into the landfill (in some cases before they even reach customers’ hands). | |
| ▲ | compiler-devel an hour ago | parent | prev [-] | | Nobody cares, unless they’re commenting for an easy win on internet message boards. |
| |
| ▲ | cyberrock 4 hours ago | parent | prev [-] | | Also see: USB 3+ e-marker chips. I'm still waiting for a Doom port on those. |
|
|
| ▲ | pibaker 33 minutes ago | parent | prev | next [-] |
You should see it as the triumph of chip manufacturing: advanced, powerful MCUs have become so cheap, thanks to manufacturing capability and economies of scale, that it is now cheaper to use a mass-manufactured general-purpose device that may take more material to produce than a simpler bespoke device made in low volumes.
You might be wondering how on earth a more advanced chip can end up being cheaper. Well, it may surprise you, but not all cost in manufacturing is material cost. If you have to design a bespoke chip for your earbuds, you now need to hire chip designers, go through the whole design and testing process, get someone to make your bespoke chip in smaller quantities (which may easily end up more expensive than the more powerful mass-manufactured chips), teach your programmers how to program your new chip, and so on. The material savings (which are questionable: are you sure you can make your bespoke chip more efficiently than the mass-manufactured ones?) are easily outweighed by business costs in other parts of the manufacturing process.
|
| ▲ | tt24 3 hours ago | parent | prev | next [-] |
Incredible to see people try to spin the wild successes of market-based economies as an economic failure. Hardware is cheap and small enough that we can run Doom on an earbud, and I’m supposed to think this is a bad thing?
| |
| ▲ | hashmap 3 hours ago | parent [-] | | I can sort of see one angle for it, and the parent story kind of supports it. Bad software is a forcing function for good hardware - the worse software has gotten in the past few decades, the better hardware has had to get to support it. So if you actually try, like OP did, you can do some pretty crazy things on tiny hardware these days. Imagine what we could do on computers if they weren't so bottlenecked doing things they don't need to do.
|
|
| ▲ | the_fall an hour ago | parent | prev | next [-] |
> economic failure to build cheaper purpose-built hardware for things like sending audio over a radio.
You're literally just wasting sand. We've perfected the process to the point where it's inexpensive to produce tiny and cheap chips that pack more power than a 386 computer. It makes little difference if it's 1,000 transistors or 1,000,000. It gets more complicated on the cutting edge, but this ain't it. These chips are probably 90 nm or 40 nm, a technology that's two decades old, and it's basically the off-ramp for older-generation chip fabs that can no longer crank out cutting-edge CPUs or GPUs. Building specialized hardware for stuff like that costs a lot more than writing software that uses just the portions you need. It requires deeper expertise, testing is more expensive and slower, etc.
|
| ▲ | varjag 5 hours ago | parent | prev | next [-] |
Earbuds often have features like mic beamforming and noise cancellation, which require a substantial degree of processing power. It's hardly unjustified compared to your Teams instance making fans spin or Home Assistant bringing an RPi to its knees.
| |
| ▲ | nehalem 5 hours ago | parent [-] | | No doubt; maybe I should have emphasised the "general" part of "general purpose" more. Not a hardware person myself, I wonder whether there would be purpose-built hardware that could do the same more cheaply – think F(P)GA. | | |
| ▲ | Aurornis an hour ago | parent | next [-] | | > I wonder whether there would be purpose-built hardware that could do the same more cheaply – think F(P)GA.
FPGAs are not cost-efficient at all for something like this. MCUs are so cheap that you'd never get to a cheaper solution by building out a team to iterate on custom hardware until it was bug-free and ready to scale. You'd basically be reinventing the MCU that can be bought for $0.10, but with tens of millions of dollars of engineering and without the economies of scale that the MCU companies have. | |
| ▲ | nicoburns 19 minutes ago | parent | prev [-] | | > I wonder whether there would be purpose-built hardware that could do the same more cheaply
Where are you imagining the cost savings coming from? Custom anything is almost always vastly more expensive than using a standardised product.
|
|
|
| ▲ | TrainedMonkey 5 hours ago | parent | prev | next [-] |
> CPU: Dual-core 300MHz ARM Cortex-M4F
It's an absolutely bonkers amount of hardware scaling that has happened since Doom was released. Yes, this is tremendous overkill, but the crazy part is that it fits into an earpiece.
| |
| ▲ | wolvoleo 14 minutes ago | parent | next [-] | | Yes, but Doom is also very, very old. I bought a Kodak camera in 2000 (640x480 resolution) and even that could run Doom, way back when. Actually playable, with sound and everything. Here's an even older one running it: https://m.youtube.com/watch?v=k-AnvqiKzjY | |
| ▲ | Telemakhos 5 hours ago | parent | prev | next [-] | | I remember playing Doom on a single-core 25MHz 486 laptop. It was, at the time, an amazing machine, hundreds of times more powerful than the flight computer that ran the Apollo space capsule, and now it is outclassed by an earbud. | | |
| ▲ | iberator 3 hours ago | parent | next [-] | | Can we finally end this Apollo computer comparison forever? It was a real-time computer NOT designed for speed but for real-time operations. Why don't you compare it to, let's say, a PDP-11, a VAX-11/780, or a Cray-1 supercomputer? NASA used a lot of supercomputers here on Earth prior to mission start. | | |
| ▲ | mlyle 2 hours ago | parent [-] | | > It was a real-time computer NOT designed for speed but for real-time operations.
More than anything, it was designed to be small and use little power. But the little ARM Cortex-M4F cores we're comparing it to are also designed for embedded, possibly hard-real-time operations. And the dominant factors in the playback experience on earbuds are response time and jitter.
If the AGC could get a capsule to the Moon doing hard real-time tasks (and spilling low-priority tasks as necessary), a single STM32F405 with a Cortex-M4F could do it better. Actually, my team is going to fly an STM32F030 for minimal power-management tasks -- but still hard real-time -- on a small satellite. Cortex-M0. It fits in 25 milliwatts vs 55W. We're clocked slow, but still exceed the throughput of the AGC by ~200-300x. Funnily enough, the amount of RAM is about the same as the AGC :D It's 70 cents in quantity, but we have to pay three whole dollars at quantity 1.
> NASA used a lot of supercomputers here on Earth prior to mission start.
Fine, let's compare to the CDC 6600, the fastest computer of the late '60s. An M4F @ 300MHz is a couple hundred single-precision megaflops; the CDC 6600 was like 3 not-quite-double-precision megaflops. The hacky "double single precision" techniques have comparable precision -- figure that's probably about 10x slower on average, so each M4F could do about 20 CDC-6600-equivalent megaflops, or is roughly 5-10x faster. The amount of RAM is about the same on this earbud.
His 486-25 -- if a DX model with the FPU -- was probably roughly twice as fast as the 6600, probably had 4x the RAM, used 2 orders of magnitude less power, and massed 3 orders of magnitude less. Control flow, integer math, etc., being much faster than that.
Just a few more pennies gets you a microcontroller with a double-precision FPU, like a Cortex-M7F with the FPv5-D16, which at 300MHz is good for maybe 60 double-precision megaflops -- compared to the 6600, 20x faster and with more precision.
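For anyone wanting to sanity-check that arithmetic, here is the same back-of-envelope calculation written out; the constants are just the rough figures quoted above, not benchmarks or measurements.

    /* Back-of-envelope check of the M4F vs CDC 6600 comparison above.
     * All figures are the approximate numbers from the comment. */
    #include <stdio.h>

    int main(void)
    {
        double m4f_sp_mflops  = 200.0; /* "a couple hundred" SP MFLOPS at 300 MHz */
        double cdc6600_mflops = 3.0;   /* CDC 6600, not-quite-double precision */
        double ds_penalty     = 10.0;  /* double-single emulation, ~10x slower */

        double m4f_equiv = m4f_sp_mflops / ds_penalty;            /* ~20 MFLOPS */
        printf("M4F at comparable precision: ~%.0f MFLOPS\n", m4f_equiv);
        printf("Speedup vs CDC 6600: ~%.1fx\n", m4f_equiv / cdc6600_mflops); /* ~6.7x */
        return 0;
    }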
| |
| ▲ | tadfisher 2 hours ago | parent | prev [-] | | And perhaps more fittingly, that PC couldn't decode and play an MP3 in real time. |
| |
| ▲ | mlyle 3 hours ago | parent | prev [-] | | This is the "little part" of what fits into an earpiece. Each of those cores is maybe 0.04 square millimeters of die on e.g. 28nm process. RAM takes some area, but that's dwarfed by the analog and power components and packaging. The marginal cost of the gates making up the processors is effectively zero. |
|
|
| ▲ | danielbln 5 hours ago | parent | prev | next [-] |
It's already very cheap to build, though. We are able to pack a ton of processing into a tiny form factor for little money (comparatively, ignoring end-consumer margins etc.). An earbud that does ANC, supports multiple different audio standards including low-battery standby, is somewhat resistant to interference, and can send and receive over many meters: that's awesome for the price. That it has enough processing to run a 33-year-old game... well, that's just technological progression. A single modern smartphone has more compute than all the global compute of 1980 combined.
| |
| ▲ | ck2 2 hours ago | parent [-] | | I need that in lunar-lander exponents (imagine the lunar lander computer being an earbud ha) | | |
| ▲ | danielbln an hour ago | parent [-] | | Well, a current smartphone would be about 10^8 times faster than the lunar lander. A single AirPod would be about 10^4 times as powerful as the entire lunar lander guidance system. Or to put it another way: a single AirPod would outcompute the entire Soviet Union's space program.
|
|
|
| ▲ | Waterluvian 2 hours ago | parent | prev | next [-] |
| I imagine it’s far more economical to have one foundry that can make a general purpose chip that’s overpowered for 95% of uses than to try to make a ton of different chips. It speaks to how a lot of the actual cost is the manufacturing and R&D. |
| |
| ▲ | sdenton4 an hour ago | parent [-] | | The only real problem I could see is if the general-purpose microcontroller is significantly more power-hungry than a specialized chip, impacting the battery life of the earbuds. On every other axis, though, it's likely a very clear win: reusable chips mean cheaper units, which often translates into real resource savings (in the extreme case, it may save an entire additional factory for the custom chips, saving untold energy and effort). |
|
|
| ▲ | gpm 3 hours ago | parent | prev | next [-] |
Neither - it's a triumph of our ability to do increasingly complex things in both software and hardware. An earbud should be able to make good use of the extra computing capacity, whether that's running more sophisticated compression to save bandwidth, or features like more sophisticated noise-cancelling/microphone-isolation algorithms. There are really very few devices that couldn't be made better given more (free) compute. It's also a triumph of the previous generation of programmers that they were able to make interesting games that took so little compute.
| |
| ▲ | buildbot 3 hours ago | parent | next [-] | | Plus there's actually less waste, I would imagine, in using a generic, very efficiently mass-produced, but way overkill part vs. a one-off, very specific, rare but perfectly matched part. | |
| ▲ | echelon 3 hours ago | parent | prev [-] | | There are enough atoms in that earbud to replace all of the world's computers. We've got a long way to go. |
|
|
| ▲ | tobinc 2 hours ago | parent | prev | next [-] |
I think it's just indicative of the fact that general-purpose hardware has more applications, and can thus be mass-produced more cheaply and at greater scale.
|
| ▲ | __MatrixMan__ 5 hours ago | parent | prev | next [-] |
| If it can run Doom it can run malware. |
|
| ▲ | mlyle 3 hours ago | parent | prev | next [-] |
The marginal cost of a small microprocessor in an ASIC is nothing. The RAM costs a little bit, but if you want to do firmware updates in a friendly way, etc., you need some RAM to stage them.
|
| ▲ | notarobot123 3 hours ago | parent | prev | next [-] |
| It's intuitive to think of wasted compute capacity as correlating with a waste of material resources. Is this really the case though? |
| |
| ▲ | mathgeek 2 hours ago | parent [-] | | Waste is subjective or, at best, hard to define. It's the classic "get rid of all the humans and nothing would be wasted" aphorism. |
|
|
| ▲ | daft_pink 2 hours ago | parent | prev [-] |
If you look at the bottom of the page, it's an advertisement by someone looking for a job, showing off his technical skills.
| |