| ▲ | imiric 7 days ago |
| You can't be serious. Intel was well on its way to becoming a considerable threat to NVIDIA with its Arc line of GPUs, which are getting better and cheaper with each generation. Perhaps not in the enterprise and AI markets yet, but certainly on the consumer side. This news muddies that approach, and I see it as a misstep both for Intel and for consumers. Intel is only helping NVIDIA, which puts Intel further away from unseating NVIDIA than it was before. Competition is always a net positive for consumers, while mergers are always a net negative. This news will only benefit the shareholders of both companies, and Intel's shareholders only in the short term. In the long term, it makes NVIDIA more powerful. |
|
| ▲ | tremon 7 days ago | parent | next [-] |
| I'm not convinced. The latest Battlemage benchmarks I've seen put the B580 at the same performance as the RTX 4060 (a two-year-old entry-level card) but with roughly 50% more power consumption (about 125W average vs the 4060's 80W). It's good to have more than one graphics vendor with open-source driver support, but I don't think Nvidia is losing any sleep over Intel's GPU offerings. |
| |
| ▲ | throwawaythekey 7 days ago | parent | next [-] | | Battlemage had the best perf/$, and most of the driver issues from Alchemist had been ironed out. Another generation or two of steady progress and Intel will have a big winner on their hands. Intel's foundry costs are probably competitive with Nvidia's too - Nvidia has too much opportunity cost if nothing else. | | | |
| ▲ | imiric 6 days ago | parent | prev | next [-] | | The B580 was released in December 2024, and the 4060 in May 2023, so not quite a two-year difference. While it doesn't quite compete on performance and power consumption, it does on price/performance and overall value. It's a $250 card, compared to the 4060's $300 at launch. You can still get it at that price, if there's stock, while the 4060 hovers around $400 now. It's also a 12GB card vs the 8GB of the 4060. So, sure, it's not competitive in the high-end segment, but it's remarkable what they've accomplished in just a few years, compared to the decades of head start that AMD and NVIDIA have on them. It's definitely not far-fetched to assume that the gap will only continue to close. Besides, Intel is not only competing in GPUs, but also in APUs and CPUs. Their APU products are more performant and power-efficient than AMD's (e.g. the Arc 140V vs the Radeon 890M). | |
| ▲ | bogwog 7 days ago | parent | prev | next [-] | | This is very short-sighted. The cards are improving, which can't really be said about AMD, the only other potential threat to Nvidia. It's also well known that Nvidia purposefully handicaps its consumer cards to avoid cannibalizing its enterprise cards. That means the consumer market, at least, is not as efficient/optimal as it could be, so a competitor actually trying to compete (unlike AMD, apparently) should be able to gain ground without even having to out-innovate Nvidia or anything like that. Just get close on compute performance, but offer more VRAM or cheaper multi-GPU setups. | |
| ▲ | fluoridation 6 days ago | parent [-] | | >cheaper multi-GPU setups Nah, nobody cares about that. Even in their heyday, SLI and CrossFire barely made sense technologically. That market is basically non-existent. There are more people now wanting to run multiple GPUs for inference than there ever were people interested in SLI, and those people can mix and match GPUs as they like. |
| |
| ▲ | HDThoreaun 7 days ago | parent | prev [-] | | Nvidia's margins are over 80% for datacenter products. If Intel can produce chips with enough VRAM and performance on par with Nvidia's from 2 years ago, at 30% margins they'd steal a lot of business, provided they can figure out the CUDA side of things. |
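A rough sketch of what that margin gap means for sticker price; the $5,000 unit cost below is a made-up number for illustration, not an actual Nvidia or Intel figure:

    # Back-of-envelope: how gross margin maps to selling price.
    # gross_margin = (price - cost) / price  =>  price = cost / (1 - gross_margin)
    def price_at_margin(unit_cost, gross_margin):
        return unit_cost / (1.0 - gross_margin)

    unit_cost = 5000.0  # hypothetical cost to build one accelerator
    print(price_at_margin(unit_cost, 0.80))  # ~25000: what an 80% margin implies
    print(price_at_margin(unit_cost, 0.30))  # ~7143: same cost basis at a 30% margin

Same hypothetical silicon cost, roughly a 3.5x difference in selling price; that's the wedge being described.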
|
|
| ▲ | JonChesterfield 7 days ago | parent | prev | next [-] |
| I'm sure Larrabee will be superb any year now. The Xeon Phi will rise again. For supporting evidence, see the success of Aurora. Weren't the loss-leading Arc GPUs cancelled as well? Maybe that was only one generation of them; it does look like some are on the market now. I think this partnership will damage Nvidia. It might damage Intel, but given they're circling the drain already, it's hard to make matters worse. It's probably bad for consumers in every dimension. Or to take the opposite view: if Nvidia rolled over Intel, fired essentially everyone in the management chain, and started trying to run the fabs themselves, there's a good chance they'd turn the ship around and become even more powerful than they already are. |
| |
| ▲ | whatever1 7 days ago | parent | next [-] | | Has Nvidia ever run a fab successfully? | | |
| ▲ | JonChesterfield 7 days ago | parent [-] | | Nope. It will/would be a learning curve. They'd probably seed it with strategic hires from TSMC. |
| |
| ▲ | imiric 7 days ago | parent | prev [-] | | > It might damage Intel, but given they're circling the drain already, it's hard to make matters worse. How is Intel "circling the drain"? They have a very competitive offering of CPUs, APUs, and GPUs, and the upcoming Panther Lake and Nova Lake architectures are very promising. Their products compete with AMD, NVIDIA, and ARM SoCs from the likes of Apple. Intel may have been in a rut years ago, but they've recovered incredibly well. This is why I'm puzzled by this decision, and as a consumer, I would rather use a fully Intel system than some bastardized version that also involves NVIDIA. We've seen how well that works with Optimus. | |
| ▲ | JonChesterfield 7 days ago | parent | next [-] | | None of their products are competitive; they fired the CEO who was meant to save them, laid off tens of thousands of their engineers, sold off massive chunks of the company, and they're still bleeding money and begging for state support? Also, their network cards no longer work properly, which is deeply aggravating, as that used to be something I could rely on; I just bought some Realtek ones to work around the Intel ones falling over. | |
| ▲ | imiric 7 days ago | parent | next [-] | | > None of their products are competitive We must live in different universes, then. Intel's 140V competes with and often outperforms AMD's 890M, at around half the power consumption.[1] Intel's B580 competes with AMD's RX 7600 and NVIDIA's RTX 4060, at a fraction of the price of the 4060.[2] They're not doing so well with desktop and laptop CPUs, although their Lunar Lake and Arrow Lake CPUs are still decent performers within their segments. The upcoming Panther Lake architecture promises to improve this. If these are not signs of competitive products, and of a company far from "circling the drain", then I don't know what is. FWIW, I'm not familiar with the health of their business, and what it takes to produce these products. But from a consumer's standpoint, Intel hasn't been this strong since... the early 00s? [1]: https://www.notebookcheck.net/Radeon-890M-vs-Arc-140V_12524_... [2]: https://www.notebookcheck.net/Intel-Arc-B580-Benchmarks-and-... | |
| ▲ | fluoridation 6 days ago | parent [-] | | No way, man. Peak consumer Intel was from Core 2 up to Skylake-ish. That was when they started coasting and handed the market to AMD. Right now they're losing market share to them on mobile, desktop, and server. If we ignore servers, most PCs have an AMD CPU inside. The GPUs might be competitive on price, but that's about it. It's pretty much a hardware open beta. | | |
| ▲ | imiric 6 days ago | parent [-] | | Ah, I was thinking of Core 2, but was off by a couple of years. Although "peak" consumer Intel was undeniably in the 90s. Like I said, Intel may not be market leader in some segments, but they certainly have very competitive products. The fact they've managed to penetrate the dGPU duopoly, while also making huge strides with their iGPUs, is remarkable on its own. They're not leaders on desktops and servers, but still have respectable offerings there. None of this points to a company that's struggling, but to a healthy market where the consumer benefits. News of two rivals collaborating like this is not positive for consumers. | | |
| ▲ | fluoridation 6 days ago | parent [-] | | The 90s were easy mode for semiconductor manufacturers because of Moore's law, and because cranking the clocks was relatively easy. After 2000 was when the really advanced microarchitectures started coming out. >a company that's struggling, but to a healthy market where the consumer benefits I would argue that the market is only marginally healthier than, say, 2018. Intel is absolutely struggling. The 13th and 14th generation were marred by degradation issues and the 15th generation is just "eh", with no real reason to pick it over Zen. The tables have simply flipped compared to seven years ago; AMD at least is not forcing consumers to change motherboards every two years. And Intel doesn't even seem to care too much that they're losing relevance. One thing they could do is enable ECC on consumer chips like AMD did for the entire Ryzen lineup, but instead they prefer to keep their shitty market segmentation. Granted, I don't think it would move too many units, but it would at least be a sign of good will to enthusiasts. |
|
|
| |
| ▲ | gregoryl 7 days ago | parent | prev [-] | | I have bad news about Realtek networking... | |
| |
| ▲ | pengaru 7 days ago | parent | prev [-] | | When your most competitive products are being made for you by your competitor, while you still carry the cost center of running your own production fabs, which are incapable of producing those products, and you're receiving bailouts just to keep the lights on... Some would say that's circling the drain. |
|
|
|
| ▲ | Retric 7 days ago | parent | prev | next [-] |
| Mergers where one company is on the verge of failing can be a net positive for consumers. Most obviously this happens when banks fail: people’s bank cards still work, and at least initially the branches stay open. Intel isn’t at that point, but the company’s trajectory isn’t looking good. I’d happily sacrifice Arc to keep a duopoly in CPUs. |
|
| ▲ | mschuster91 7 days ago | parent | prev | next [-] |
| > This news muddies that approach, and I see it as a misstep both for Intel and for consumers. Consumers still have AMD as an alternative for very decent and attractively priced GPUs (and CPUs). |
| |
| ▲ | adrian_b 7 days ago | parent [-] | | Not everybody wants GPUs for games or for AI. AMD has always closely followed NVIDIA in crippling their cheap GPUs for any other applications. After many years of continuously decreasing FP64 performance in "consumer" GPUs, only Intel, with the Battlemage GPUs, has offered FP64 performance comparable with what could easily be obtained 10 years ago but no longer can be today. Therefore, if the Intel GPUs disappear, the choices in GPUs will certainly become much more restricted than they are today. AMD has almost never attempted to compete with NVIDIA on features, but whenever NVIDIA dropped some feature, so did AMD. | |
| ▲ | kbolino 6 days ago | parent [-] | | The only consumer GPUs ten years ago that offered decent FP64 performance were the GTX TITAN series. And they were beasts! It's a shame nothing quite like them exists anymore. But they were the highest of high-end cards, certainly not that common or cheap. | | |
| ▲ | adrian_b 6 days ago | parent | next [-] | | The AMD Hawaii GPUs in their professional variant (FirePro), which were cheap, unlike the "datacenter" GPUs of today, and the more recent Radeon VII, had much better FP64 performance per $ than the GTX Titan. Moreover, there were claims that memory errors on the GTX Titan were quite frequent. In graphics applications memory errors seldom matter, but if you have to do a computation twice to be certain that no memory errors affected the results, that removes much of the performance advantage of a GPU. | |
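To put rough numbers on that gap, here's a quick sketch; the FP32 figures and FP64 rate ratios are approximate values recalled from public spec sheets, so treat them as ballpark only:

    # Back-of-envelope FP64 throughput: peak FP32 TFLOPS * FP64:FP32 rate ratio.
    # All figures below are approximate and quoted from memory of public spec sheets.
    cards = {
        "FirePro W9100 (Hawaii, 2014)": (5.2, 1 / 2),    # pro Hawaii ran FP64 at 1:2
        "GTX Titan (Kepler, 2013)":     (4.5, 1 / 3),    # 1:3 with the FP64 mode enabled
        "Radeon VII (2019)":            (13.4, 1 / 4),   # 1:4 FP64 rate
        "RTX 4060 (2023)":              (15.1, 1 / 64),  # 1:64, typical current consumer rate
    }
    for name, (fp32_tflops, ratio) in cards.items():
        print(f"{name}: ~{fp32_tflops * ratio:.2f} FP64 TFLOPS")

If those numbers are roughly right, the decade-old cards deliver on the order of ten times the FP64 throughput of a current entry-level consumer card, which is the regression being described.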
| ▲ | kbolino 6 days ago | parent | next [-] | | Fair enough. I did not know about these. It's hard to find reliable MSRP for them today, though. Given the era, market segment, and the competition, I'd estimate $1500-2000. It's not clear to me they were on consumer store shelves, either, whereas the GTX Titan was. A cheap GPU ten-plus years ago was $200-300. That GPU either had no FP64 units at all, or had them "crippled" just like today. What happened between then and now is that the $1k+ market segment became the $10k+ market segment (and the $200+ market segment became the $500+ market segment). That sucks, and nVidia and AMD are absolutely milking their customers for all they're worth, but nothing really got newly "crippled" along the way. | |
|
|
|
|
|
| ▲ | Spinnaker_ 6 days ago | parent | prev [-] |
| The consumer GPU market is a rounding error compared to enterprise AI. And Intel is zero threat to Nvidia there. |