| ▲ | aurareturn 6 days ago |
| It has the A19 Pro. The A19 Pro has matmul acceleration in its GPU, the equivalent of Nvidia's Tensor cores. This would make future Macs extremely viable for local LLMs. Currently, Macs have high memory bandwidth and high VRAM capacity but low prompt processing speeds. Give it a large context and it'll take forever before the first token is generated. If the M5 generation gets this GPU upgrade, which I don't see why not, then the era of viable local LLM inferencing is upon us. That's the most exciting thing from this Apple event, in my opinion. PS. I also like the idea of the ultra-thin iPhone Air, the 2x better noise cancellation and live translation of the AirPods Pro 3, the high blood pressure detection of the new Watch, and the bold sexy orange color of the iPhone 17 Pro. Overall, this is the best set of incremental updates Apple's ecosystem has seen in a while. |
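A rough back-of-envelope sketch of why prompt processing is the pain point (the model size, quantization, and throughput figures here are illustrative assumptions, not measurements): for an 8B-parameter dense model at 8-bit fed an 8K-token prompt, prefill costs roughly 2 × 8×10^9 × 8192 ≈ 1.3×10^14 FLOPs, so at ~10 TFLOPS of usable GPU compute that is on the order of ten seconds before the first token appears. Decoding, by contrast, is bandwidth-bound at roughly memory bandwidth ÷ weight size ≈ 400 GB/s ÷ 8 GB ≈ 50 tokens/s. Dedicated matmul units raise the effective FLOPS, which attacks exactly the prefill half of that equation.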
|
| ▲ | vasco 6 days ago | parent | next [-] |
| > bold sexy orange color Luckily they added the blood pressure check for when you get too excited about the color orange. |
| |
| ▲ | formerly_proven 6 days ago | parent | next [-] | | It is almost strange, since iPhones were only available in ugly drab colors for several generations. And the Pro models in particular were previously never available in a decent color. | | |
| ▲ | solids 6 days ago | parent | next [-] | | 99% of people use a case for the phone, so the color doesn't change anything | | |
| ▲ | dzdt 5 days ago | parent | next [-] | | If this were true, wouldn't there be a market for a ruggedized version that has the toughness of a case, from the factory as shipped? It's a little silly for Apple to shave every possible half-millimeter from the design and then have 99% of people add back the thickness, plus a lot extra, by adding a case. Why not offer a factory-ruggedized version which isn't as thick as adding that case but is just as rugged? | | |
| ▲ | cardanome 5 days ago | parent | next [-] | | Considering the price and re-sale value of iphones I would add a case even if they ruggedized it. My current (Android) phone is from 2020 and I have bought three cases for it because the previous ones got wear and tear. The phone inside still looks brand new. But yeah, the trend of ultra-thin phones is silly. | |
| ▲ | shinycode 5 days ago | parent | prev [-] | | That opens up a new business for them: selling $60 cases that are worth $2 in materials, with a wide range of colors to match people's tastes, which makes them even more appealing to buy |
| |
| ▲ | GTP 5 days ago | parent | prev | next [-] | | If you like the color, you can use a transparent case. | | | |
| ▲ | m-s-y 5 days ago | parent | prev | next [-] | | Transparent cases are a thing dontchaknow! | |
| ▲ | alwillis 3 days ago | parent | prev | next [-] | | I bought a blue iPhone 16 last year at this time; I've never used a case. More people are going case-less these days. | |
| ▲ | op7 5 days ago | parent | prev [-] | | Ask workers of cell phone stores and you’ll find that figure is way off. Not everyone wants a case. Having a case significantly changes the feeling of the device in hand. | | |
| ▲ | chadrs 5 days ago | parent [-] | | As someone who does not use a case, I almost never see anyone else without one. To the point that when I do, I usually mistake their phone for mine. |
|
| |
| ▲ | steve_adams_86 6 days ago | parent | prev | next [-] | | The 15s and 16s both had titanium bodies which (as I recall at least) don't take on colour as well when they're anodized, so that could be the cause of drab colour ways. edit: It was only the Pros and up which had titanium bodies. The 17s are all aluminum. | | |
| ▲ | mauvehaus 5 days ago | parent | next [-] | | Anodizing titanium creates an oxide layer, the thickness of which varies with the voltage used. The thickness of that oxide layer determines which wavelengths of light it reflects (thin-film interference) [0]. In practical terms, your choices for color are pretty limited [1]. I'm not a chemist, but I looked into this years back when I was wondering why everything titanium is offered in the same couple of colors. Personally, I like the plain gray. [0] https://wisensemachining.com/titanium-anodizing-guide/ [1] https://www.snowpeak.com/collections/cups/products/ti-single... | | |
| ▲ | steve_adams_86 5 days ago | parent | next [-] | | Hey, cool! This is very interesting and explains a lot. Thanks for connecting more dots for me > I looked into this years back when I was wondering why everything titanium is offered in the same couple of colors I wondered the same thing, but never hit that threshold of urgency to actually look into it! | |
| ▲ | svitlak 3 days ago | parent | prev [-] | | this is so cool! thanks for sharing! 18 pro rainbow phone anyone? |
| |
| ▲ | formerly_proven 6 days ago | parent | prev [-] | | But even the non-Pro phones had mostly ugly colors in the last couple years. Maybe to match the ugliness of the Pro models? | | |
| ▲ | jachee 6 days ago | parent [-] | | A realization I came to today: I'm still on my 12 because it's (one of?) the last PRODUCT(RED) ones. The Sage Air has my eye. Would match my AirPods Max. But that orange Pro is also calling. |
|
| |
| ▲ | giancarlostoro 5 days ago | parent | prev | next [-] | | I don't get why Apple doesn't do consistent colors. I loved the blue of my iPhone 12 Pro, but I can't even get that anymore... I would have upgraded a few generations back if they had kept consistent colors. | | |
| ▲ | selectodude 5 days ago | parent | next [-] | | The changed colors are the signal that you have the new one. | | | |
| ▲ | sandworm101 5 days ago | parent | prev | next [-] | | And I have no idea what color my phone was when I got it. It has been inside an OtterBox case since the hour I first had it. For me, the color of a cellphone is about as relevant as the color of a motherboard. It will look cool for, at most, a few minutes before it is forever locked inside a case. | |
| ▲ | alwillis 3 days ago | parent | prev | next [-] | | > I don't get why Apple doesn't do consistent colors The materials used are a factor: the last few iPhones were built with aluminum (iPhone 16), titanium (iPhone 16 Pro) and stainless steel (iPhone 13 Pro). Not all colors work with all materials; my understanding is titanium is particularly bad for bright colors. The colors for the iPhone Pro models have been pretty drab, but not this year. | |
| ▲ | nsxwolf 5 days ago | parent | prev [-] | | I have that one, I sometimes take the case off just to look at it. |
| |
| ▲ | linhns 5 days ago | parent | prev [-] | | But somehow the trademark silver is disappearing. |
| |
| ▲ | bobmcnamara 5 days ago | parent | prev [-] | | BondiBlue4lyfe |
|
|
| ▲ | SirMaster an hour ago | parent | prev | next [-] |
| The live translation is software. It works on the AirPods Pro 2 and the AirPods 4 with ANC. So is the high blood pressure detection: it's not exclusive to the new watch, it also works on the Series 10 and Series 9 watches. |
|
| ▲ | astrange 6 days ago | parent | prev | next [-] |
| A19 supports MTE: https://news.ycombinator.com/item?id=45186265 This is a very powerful feature for anyone who likes security or finding bugs in their code. Or other people's code. Even if you didn't really want to find them. |
| |
| ▲ | rising-sky 6 days ago | parent | next [-] | | MIE | | |
| ▲ | philodeon 6 days ago | parent [-] | | MIE is a combination of enhanced MTE (EMTE) and some highly-overdue software allocator improvements. | | |
| ▲ | tucnak 5 days ago | parent | next [-] | | It certainly took them a while to introduce MTE! Pixel 8 came out in 2023. I wonder how it compares against hardened_malloc with 48-bit address space and 33-bit ASLR in Graphene. Apple's security team has reported that MIE could break all "known" exploit chains, but so does hardened_malloc. Hard to tell right now which one is best (most def MIE) but everything else included in Graphene is probably making the point moot anyway. | |
| ▲ | rising-sky 5 days ago | parent | prev [-] | | Yes, but it is not MTE, they are technically different. That's what I was attempting to point out but thought it may have been a typo |
|
| |
| ▲ | baybal2 6 days ago | parent | prev [-] | | [dead] |
|
|
| ▲ | mgerdts 6 days ago | parent | prev | next [-] |
| If you compare the specs of the Series 10 and Series 11 watches you will see they both claim high blood pressure detection. https://www.apple.com/watch/compare/?modelList=watch-series-... In the past few weeks the blood oxygen (oximeter) feature was enabled by a firmware update on the Series 10. Measurements are done on the watch; results are only reported on a phone. |
| |
| ▲ | SirMaster an hour ago | parent | next [-] | | Also works on the Series 9. | |
| ▲ | sgustard 6 days ago | parent | prev | next [-] | | Good to know! The fine print: As of September 9, 2025, hypertension notifications are currently under FDA review and expected to be cleared this month, with availability on Apple Watch Series 9 and later and Apple Watch Ultra 2 and later. The feature is not intended for use by people under 22 years old, those who have been previously diagnosed with hypertension, or pregnant persons. | | |
| ▲ | tartrate 6 days ago | parent [-] | | > [hypertension notifications] is not intended for use by people [...] who have been previously diagnosed with hypertension Sounds a bit ironic but I guess it's for legal reasons. | | |
| ▲ | SirMaster an hour ago | parent | next [-] | | It's the same for the AFib detection. The point of the detection features is to notify people who are not diagnosed with the condition so they go to the doctor and possibly get diagnosed. It's not useful for people already diagnosed because they already know they have it, so the notification would just be telling them something they already know. | |
| ▲ | pixiemaster 6 days ago | parent | prev | next [-] | | legal, and also: if you already have been diagnosed, you should already be under medical professional supervision (meds, checkups,…) anyway. my guess is this is more like the heart irregularities feature: it’s for the first diagnosis. (a relative of mine actually got diagnosed that way) | |
| ▲ | kalap_ur 5 days ago | parent | prev [-] | | I believe this is for the fringe case where you have been diagnosed with hypertension but your Apple Watch does not tell you that you have a hypertension risk, so you decide not to take your drugs, since your watch told you all clear. This could trigger lawsuits if complications set in when you decide not to take your drugs because of the "lack of alarm" | |
| ▲ | jacquesm 5 days ago | parent [-] | | Then you get a new fringe case: you are not yet diagnosed with hypertension, but you are aware that your apple watch has that functionality so you decide you don't need to be diagnosed. |
|
|
| |
| ▲ | 6 days ago | parent | prev | next [-] | | [deleted] | |
| ▲ | zimpenfish 6 days ago | parent | prev [-] | | Going to be interesting comparing the series 10 blood pressure sensing against my Hilo (formerly Aktiia) band on the other wrist. Although without calibration against a cuff, I'm not super convinced the Apple Watch will give reliable information. |
|
|
| ▲ | zumu 6 days ago | parent | prev | next [-] |
| > the bold sexy orange color of the iPhone 17 Pro The color lineup reminds me of the au MEDIA SKIN phones (Japanese carrier) circa 2007. Maybe it's because I had one back in the day, but I can't help but think they took some influence. |
| |
|
| ▲ | babl-yc 6 days ago | parent | prev | next [-] |
| I've always been a bit confused about when to run models on the GPU vs the Neural Engine. As best I can tell, the GPU is simpler to use as a developer, especially when shipping a cross-platform app, but an optimized Neural Engine model can run at lower power. With the addition of NPUs to the GPU, this story gets even more confusing... |
| |
| ▲ | avianlyric 6 days ago | parent [-] | | In reality you don't have much of a choice. Most of the APIs Apple exposes for running neural nets don't let you pick. Instead some Apple magic in one of their frameworks decides where it's going to host your network. At least from what I've read, these frameworks will usually distribute your networks over all available matmul compute, starting on the Neural Engine (assuming your specific network is compatible) and spilling onto the GPU as needed. But there isn't a trivial way to specifically target the neural engine. | |
| ▲ | babl-yc 6 days ago | parent [-] | | You're right there is no way to specifically target the neural engine. You have to use it via CoreML which abstracts away the execution. If you use Metal / GPU compute shaders it's going to run exclusively on GPU. Some inference libraries like TensorFlow/LiteRT with backend = .gpu use this. | | |
| ▲ | scosman 5 days ago | parent [-] | | Exactly. And most folks are using a framework like llama.cpp which does control where it’s run. |
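To illustrate the CoreML point above, a minimal Swift sketch (the compiled model file name is hypothetical; this only constrains which compute units Core ML may use, it cannot force the Neural Engine):

    import CoreML

    // Constrain Core ML to CPU + Neural Engine; other options are .all, .cpuAndGPU, .cpuOnly.
    let config = MLModelConfiguration()
    config.computeUnits = .cpuAndNeuralEngine

    do {
        // "MyNet.mlmodelc" is a placeholder for a compiled Core ML model bundled with the app.
        let model = try MLModel(
            contentsOf: URL(fileURLWithPath: "MyNet.mlmodelc"),
            configuration: config
        )
        // Core ML still decides per-layer placement; layers the ANE cannot run
        // fall back to the CPU (or the GPU when .all is selected).
        print(model.modelDescription)
    } catch {
        print("Failed to load model: \(error)")
    }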
|
|
|
|
| ▲ | commandersaki 6 days ago | parent | prev | next [-] |
| Hoping this budget macbook rumour based on A19/A19 Pro is real. |
| |
| ▲ | cj 5 days ago | parent [-] | | Isn’t the MacBook Air already pretty cheap at $999? | | |
| ▲ | toxican 5 days ago | parent | next [-] | | $998, $997, etc. | | | |
| ▲ | commandersaki 5 days ago | parent | prev | next [-] | | Competing at the high end of the budget PC laptop market would increase market share, but more importantly it'd be an easy recommendation to make when people are looking at lower-priced PC laptops. Rumours were $699. |
| ▲ | alwillis 3 days ago | parent | prev | next [-] | | Sure but the one he's referring to is rumored to cost $599-$699. | |
| ▲ | triceratops 5 days ago | parent | prev [-] | | Considering inflation $999 is cheaper. |
|
|
|
| ▲ | sercand 6 days ago | parent | prev | next [-] |
| Where did you see the matmul acceleration support? I couldn't find this detail online. |
| |
| ▲ | aurareturn 6 days ago | parent [-] | | Apple calls it "Neural Accelerators". It's all over their A19 marketing. | | |
| ▲ | kridsdale3 6 days ago | parent | next [-] | | What a ridiculous way to market "linear algebra transistor array". | | |
| ▲ | jacquesm 6 days ago | parent | next [-] | | Hey man, it helps you think different. You just never knew your neurons needed accelerating. | | |
| ▲ | kridsdale1 6 days ago | parent [-] | | I accelerate them every morning with an Americano. | | |
| ▲ | liamwire 6 days ago | parent [-] | | I have to ask out of curiosity, why is your first comment made with one account, and the reply with a similarly-named alt? | | |
| ▲ | kmarc 6 days ago | parent [-] | | To confuse all those neural accelerators scraping this conversation. | | |
| ▲ | liamwire 6 days ago | parent [-] | | That seems incredibly prescient for accounts created before even GPT-1. Obviously broad data scraping existed before then, but even amongst this crowd I find it hard to believe that’s the real motivator. | | |
|
|
|
| |
| ▲ | butlike 5 days ago | parent | prev | next [-] | | I really hope someone got fired for this blunder | |
| ▲ | jimbokun 5 days ago | parent | prev | next [-] | | Which means what, exactly, to someone who's not a machine learning researcher? |
| ▲ | 6 days ago | parent | prev [-] | | [deleted] |
| |
| ▲ | kamranjon 6 days ago | parent | prev | next [-] | | Don’t all of the M series chips contain neural cores? | | |
| ▲ | aurareturn 6 days ago | parent [-] | | Yes, they do. They're called Neural Engine, aka NPUs. They aren't being used for local LLMs on Macs because they are optimized for power efficiency running much smaller AI models. Meanwhile, the GPU is powerful enough for LLMs but has been lacking matrix multiplication acceleration. This changes that. | | |
| ▲ | astrange 6 days ago | parent | next [-] | | The neural engine is used for the built-in LLM that does text summaries etc., just not third party LLMs. And there's an official port of Stable Diffusion to it: https://github.com/apple/ml-stable-diffusion | |
| ▲ | mrheosuper 6 days ago | parent | prev | next [-] | | I thought one of the reasons we do ML on GPUs is fast matrix multiplication? So the new engine is an accelerator for the matmul accelerator? | |
| ▲ | wtallis 6 days ago | parent [-] | | From a compute perspective, GPUs are mostly about fast vector arithmetic, with which you can implement decently fast matrix multiplication. But starting with NVIDIA's Volta architecture at the end of 2017, GPUs have been gaining dedicated hardware units for matrix multiplication. The main purpose of augmenting GPU architectures with matrix multiplication hardware is for machine learning. They aren't directly useful for 3D graphics rendering, but their inclusion in consumer GPUs has been justified by adding ML-based post-processing and upscaling like NVIDIA's various iterations of DLSS. |
| |
| ▲ | cchance 6 days ago | parent | prev [-] | | These are different; these are built into the GPU cores |
|
| |
| ▲ | emchammer 6 days ago | parent | prev [-] | | Does this mean that equivalent logic for what has been called Neural Engine is now integrated into each CPU core? | | |
| ▲ | rmccue 6 days ago | parent [-] | | Each GPU core, but yes, this was part of what they announced today - it’s now integral rather than separate. |
|
|
|
|
| ▲ | whyenot 6 days ago | parent | prev | next [-] |
| I wish they would offer the 17 pro in some lighter colors (like the new sage green for the regular 17). Not everyone wants bold, and the color selection for pro is always so limited. They don't even have white with this generation, just silver. |
|
| ▲ | Nokinside 6 days ago | parent | prev | next [-] |
| The first SoC including Neural Engine was the A11 Bionic, used in iPhone 8, 8 Plus and iPhone X, introduced in 2017. Since then, every Apple A-series SoC has included a Neural Engine. |
| |
| ▲ | aurareturn 6 days ago | parent | next [-] | | The Neural Engine is its own block. Neural Engine is not used for local LLMs on Macs. Neural Engine is optimized for power efficiency while running small models. It's not good for LARGE language models. This change is strictly adding matmul acceleration into each GPU core where it is being used for LLMs. | | | |
| ▲ | runjake 6 days ago | parent | prev [-] | | The matmul stuff is part of the Neural Accelerator marketing, which is distinct from the Neural Engine you're talking about. I don't blame you. It's confusing. | | |
| ▲ | Nokinside 6 days ago | parent [-] | | It's a renaming and rearrangement of the same stuff. Not a new feature. | |
| ▲ | aurareturn 6 days ago | parent | next [-] | | The NPU is still there. This adds matmul acceleration directly into each GPU core. It takes about ~10% more transistors to add these accelerators into the GPU so it's a significant investment for Apple. | |
| ▲ | runjake 6 days ago | parent | prev [-] | | 1. It adds new features. Eg. see matmul and other to-be-detailed-soon features. 2. It moves some stuff from the external Neural Engine to the GPU, which substantially increases speeds for those workloads. That itself is a feature. Will any of this really matter much to the average consumer at this point? Probably not. Not until Apple Intelligence gets off the ground. |
|
|
|
|
| ▲ | atcon 5 days ago | parent | prev | next [-] |
| Viable local inference may already be here: a demo of SmolLM3-3B <https://news.ycombinator.com/item?id=44501413> on an iPhone with ASR + TTS: <https://x.com/adrgrondin/status/1965097304995889642> Intrigued to explore with the A19/M5 and test energy efficiency. |
|
| ▲ | AdventureMouse 6 days ago | parent | prev | next [-] |
| > If the M5 generation gets this GPU upgrade, which I don't see why not, then the era of viable local LLM inferencing is upon us. I don't think local LLMs will ever be a thing except for very specific use cases. Servers will always have way more compute power than edge nodes. As server power increases, people will expect more and more of the LLMs and edge node compute will stay irrelevant since its relative power will stay the same. |
| |
| ▲ | seanmcdirmid 6 days ago | parent | next [-] | | Local LLMs would be useful for low-latency local language processing/home control, assuming they ever become fast enough that the 500ms to 1s network latency becomes a dominant factor in having a fluid conversation with a voice assistant. Right now the pauses are unbearable for anything but one-way commands (Siri, do something! - 3 seconds later it starts doing the thing... that works, but it wouldn't work if Siri needed to ask follow-up questions). This is even more important if we consider low-latency gaming situations. Mobile applications are also relevant. An LLM in your car could be used for local intelligence. I'm pretty sure self-driving cars use some amount of local AI already (although obviously not LLMs, and I don't really know how much of their processing is local vs done on a server somewhere). If models stop advancing at a fast clip, hardware will eventually become fast and cheap enough that running models locally isn't something we think about as being a nonsensical luxury, in the same way that we don't think that rendering graphics locally is a luxury even though remote rendering is possible. | |
| ▲ | dgacmu 5 days ago | parent [-] | | Network latency in most situations is not 500ms. The latency from New York to California is under 70ms, and if you add in some transmission time you're still under 200ms. And that's ignoring that an NYC request will probably go only to VA (sub-15ms). Even over LTE you're looking at under 120ms coast to coast. | |
| ▲ | seanmcdirmid 5 days ago | parent [-] | | You have to take any of those numbers and multiply them by two, since you have to go there and then back again. | | |
|
| |
| ▲ | jameshart 6 days ago | parent | prev | next [-] | | > Servers will always have way more compute power than edge nodes This doesn't seem right to me. You take all the memory and CPU cycles of all the clients connected to a typical online service, compared to the memory and CPU in the datacenter serving it? The vast majority of compute involved in delivering that experience is on the client. And there's probably vast amounts of untapped compute available on that client - most websites only peg the client CPU by accident because they triggered an infinite loop in an ad bidding war; imagine what they could do if they actually used that compute power on purpose. But even doing fairly trivial stuff, a typical browser tab is using hundreds of megs of memory and an appreciable percentage of the CPU of the machine it's loaded on, for the duration of the time it's being interacted with. Meanwhile, serving that content out to the browser took milliseconds, and was done at the same time as the server was handling thousands of other requests. Edge compute scales with the amount of users who are using your service: each of them brings along their own hardware. Server compute has to scale at your expense. Now, LLMs bring their special needs - large models that need to be loaded into vast fast memory... there are reasons to bring the compute to the model. But it's definitely not trivially the case that there's more compute in servers than clients. | | |
| ▲ | arghwhat 6 days ago | parent [-] | | The sum of all edge nodes exceeds the power in the datacenter, but the peak power provided to you from the datacenter significantly exceeds your edge node capabilities. A single datacenter machine with state-of-the-art GPUs serving LLM inference can be drawing in the tens of kilowatts, and you borrow a sizable portion for a moment when you run a prompt on the heavier models. A phone that has to count individual watts, or a laptop that peaks at double-digit sustained draw, isn't remotely comparable, and the gap isn't one or two hardware features. |
| |
| ▲ | pdpi 6 days ago | parent | prev | next [-] | | As an industry, we've swung from thin clients to fat clients and back countless times. I'm sure LLMs won't be immune to that phenomenon. | | |
| ▲ | meltyness 6 days ago | parent [-] | | I adore this machinery; there's a lot of money riding on the idea that interest in AI/ML will result in the value being in owning a bunch of big central metal like the cloud era produced, but I'm not so sure. | |
| ▲ | SturgeonsLaw 6 days ago | parent [-] | | I'm sure the people placing multibillion dollar bets have done their research, but the trends I see are AI getting more efficient and hardware getting more powerful, so as time goes on, it'll be more and more viable to run AI locally. Even with token consumption increasing as AI abilities increase, there will be a point where AI output is good enough for most people. Granted, people are very willing to hand over their data and often money to rent a software licence from the big players, but if they're all charging subscription fees where a local LLM costs nothing, that might cause a few sleepless nights for a few execs. | | |
| ▲ | meltyness 6 days ago | parent | next [-] | | TTS would be an interesting case study. It hasn't really been in the limelight, so it could serve as a leading indicator for what will happen when attention to text generation inevitably wanes. I use Read Aloud across a few browser platforms because sometimes I don't care to read an article I have some passing interest in. The landscape is a mess: it's not really bandwidth-efficient to transmit on one count, local frameworks like Piper perform well in a lot of cases, there are paid APIs from the big players, at least one player has incorporated API-powered neural TTS and packaged it into their browser (presumably ad-supported or something), and yet another has incorporated it into their OS already (though it defaults to speak-and-spell for god knows why). I'm not willing to pay $0.20 per page though, after experimenting, especially when the free/private solution is good enough. | |
| ▲ | impure-aqua 6 days ago | parent | prev [-] | | We could potentially see one-time-purchase model checkpoints, where users pay to get a particular version for offline use, and future development is gated behind paying again- but certainly the issue of “some level of AI is good enough for most users” might hurt the infinite growth dreams of VCs |
|
|
| |
| ▲ | Closi 5 days ago | parent | prev | next [-] | | IMO the benefit of a local LLM on a smartphone isn't necessarily compute power/speed - it's reliability without a reliance on connectivity, it can offer privacy guarantees, and assuming the silicon cost is marginal, could mean you can offer permanent LLM capabilities without needing to offer some sort of cloud subscription. | |
| ▲ | hapticmonkey 6 days ago | parent | prev | next [-] | | If the future is AI, then a future where all compute has to pass through one of a handful of multinational corporations with GPU farms...is something to be wary of. Local LLMs are a great idea for smaller tasks. | |
| ▲ | tonyhart7 6 days ago | parent [-] | | But it's not the future; we can already do that right now. The problem is people's expectations: they want the model to be smart. People don't have a problem with whether it's local or not, but they want the model to be useful | |
| ▲ | aurareturn 5 days ago | parent [-] | | Sure, that's why local LLMs aren't popular or mass market as of September 2025. But cloud models will have diminishing returns, local hardware will get drastically faster, and techniques to efficiently inference them will be worked out further. At some point, local LLMs will have its day. | | |
| ▲ | tonyhart7 5 days ago | parent [-] | | Only in theory, and that's not going to happen. This is the same thing that happened with the software and game industries: the free market forces people to raise the bar every year, so the requirements of apps and games are never met; they only go up. Humans are never satisfied and the boundary keeps getting pushed further. That's why we have 12GB or 16GB of RAM in smartphones right now just for the system + apps, and now we must accommodate a local LLM too??? It only goes up; people will demand smarter and smarter models, and today's frontier model will be deemed unusable (dumb) in 5 years. Example: people literally screaming in agony when Anthropic quantized their model |
|
|
| |
| ▲ | Nevermark 6 days ago | parent | prev | next [-] | | Boom! [0] > Deepseek-r1 was loaded and ran locally on the Mac Studio > M3 Ultra chip [...] 32-core CPU, an 80-core GPU, and the 32-core Neural Engine. [...] 512GB of unified memory, [...] memory bandwidth of 819GB/s. > Deepseek-r1 was loaded [...] 671-billion-parameter model requiring [...] a bit less than 450 gigabytes of [unified] RAM to function. > the Mac Studio was able to churn through queries at approximately 17 to 18 tokens per second > it was observed as requiring 160 to 180 Watts during use Considering getting this model. Looking into the future, a Mac Studio M5 Ultra should be something special. [0] https://appleinsider.com/articles/25/03/18/heavily-upgraded-... | | |
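As a hedged sanity check on those numbers (assuming the commonly cited ~37B active parameters per token for DeepSeek-R1's mixture-of-experts design): ~450 GB for 671B parameters is roughly 0.67 bytes per weight, so each generated token touches about 37×10^9 × 0.67 ≈ 25 GB of weights, and 819 GB/s ÷ 25 GB ≈ 33 tokens/s as a theoretical ceiling. The observed 17 to 18 tokens/s is roughly half of that, which is plausible once attention, KV-cache traffic, and kernel overheads are counted.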
| ▲ | bigyabai 5 days ago | parent [-] | | "Maybe Apple will disprove you in the future" isn't a great refutation of the parent's point. | | |
| ▲ | evilduck 4 days ago | parent [-] | | "Servers are more powerful" isn't a super strong point. Why aren't all PC gamers rendering games on servers if raw power was all that mattered? Why do workstation PCs even exist? Society is already giving pushback to AI being pushed on them everywhere; see the rise of the word "clanker". We're seeing mental health issues pop up. We're all tired of AI slop content and engagement bait. Even the developers like us discussing it at the bleeding edge go round in circles with the same talking points reflexively. I don't see it as a given that there's public demand for even more AI, "if only it were more powerful on a server". | | |
| ▲ | bigyabai 4 days ago | parent [-] | | You make a good point, but you're still not refuting the original argument. The demand for high-power AI still exists, the products that Apple sells today do not even come close to meaningfully replacing that demand. If you own an iPhone, you're probably still using ChatGPT. Speaking to your PC gaming analogy, there are render farms for graphics - they're just used for CGI and non-realtime use cases. What there isn't a huge demand for is consumer-grade hardware at datacenter prices. Apple found this out the hard way shipping Xserve prematurely. | | |
| ▲ | evilduck 3 days ago | parent | next [-] | | > Speaking to your PC gaming analogy, there are render farms for graphics - they're just used for CGI and non-realtime use cases. What there isn't a huge demand for is consumer-grade hardware at datacenter prices. Right, and that's despite the datacenter hardware being far more powerful and for most people cheaper to use per hour than the TCO of owning your own gaming rig. People still want to own their computer and want to eliminate network connectivity and latency being a factor even when it's generally a worse value prop. You don't see any potential parallels here with local vs hosted AI? Local models on consumer grade hardware far inferior to buildings full of GPUs can already competently do tool calling. They can already generate tok/sec far beyond reading speed. The hardware isn't serving 100s of requests in parallel. Again, it just doesn't seem far fetched to think that the public will sway away from paying for more subscription services for something that can basically run on what they already own. Hosted frontier models won't go away, they _are_ better at most things, but can all of these companies sustain themselves as businesses if they can't keep encroaching into new areas to seek rent? For the average ChatGPT user, local Apple Intelligence and Gemma 3n basically already have the skills and smarts required, they just need more VRAM, and access to RAG'd world knowledge and access to the network to keep up. | |
| ▲ | pdimitar 3 days ago | parent | prev [-] | | > The demand for high-power AI still exists, the products that Apple sells today do not even come close to meaningfully replacing that demand. Correct, though to me it seems that this comes at the price of narrowing the target audience (i.e. devs and very high-demanding analysis + production work). For almost everything else people just open a bookmarked ChatGPT / Gemini link and let it flow, no matter how erroneous it might be. The AI area is burning a lot of bridges and has done so for the last 1.5 - 2.0 years; they solidify the public's idea that they only peddle subscription income as hard as they can without providing more value. Somebody finally had the right idea some months ago: sub-agents. Took them a while, and it was obvious right from the start that just dumping 50 pages on your favorite LLM is never going to produce impressive results. I mean, sometimes it does but people do a really bad job at quickly detecting when it does not, and are slow to correct course and just burn through tokens and their own patience. Investors are gonna keep investor-ing, they will of course want the paywall and for there to be no open models at all. But happily the market and even general public perception are pushing back. I am really curious what will come out of all this. One prediction is local LLMs that secretly transmit to the mothership, so the work of the AI startup is partially offloaded to its users. But I am known to be very cynical, so take this with a spoonful of salt. |
|
|
|
| |
| ▲ | waterTanuki 6 days ago | parent | prev | next [-] | | I regularly use local LLMs at work (full stack dev) due to restrictions and occasionally I get some results comparable to gpt-5 or opus 4 | | |
| ▲ | eprparadox 6 days ago | parent [-] | | this is really cool. could you say a bit about your setup (which llms, what tasks they’re best for, etc)? | | |
| ▲ | waterTanuki 5 days ago | parent [-] | | I switch between gpt-oss:20b/qwen3:30b. Good for green fielding projects, setting up bash scripts, simple CRUD apis using express, and the occasional error in a React or Vue app. |
|
| |
| ▲ | rowanG077 6 days ago | parent | prev | next [-] | | That's assuming diminishing returns won't hit hard. If a 10x smaller local model is 95% (whatever that means) as good as the remote model, it makes sense to use local models most of the time. It remains to be seen if that will happen, but it's certainly not unthinkable IMO. | |
| ▲ | sigmar 6 days ago | parent [-] | | It's really task-dependent, text summarization and grammar corrections are fine with local models. I posit any tasks that are 'arms race-y' (image generation, creative text generation) are going to be offloaded to servers, as there's no 'good enough' bar above which they can't improve. |
| |
| ▲ | PaulRobinson 5 days ago | parent | prev | next [-] | | Apple literally mentioned local LLMs in the event video where they announced this phone and others. Apple's privacy stance is to do as much as possible on the user's device and as little as possible in cloud. They have iCloud for storage to make inter-device synch easy, but even that is painful for them. They hate cloud. This is the direction they've had for some years now. It always makes me smile that so many commentators just can't understand it and insist that they're "so far behind" on AI. All the recent academic literature suggests that LLM capability is beginning to plateau, and we don't have ideas on what to do next (and no, we can't ask the LLMs). As you get more capable SLMs or LLMs, and the hardware gets better and better (who _really_ wants to be long on nVIDIA or Intel right now? Hmm?), people are going to find that they're "good enough" for a range of tasks, and Apple's customer demographic are going to be happy that's all happening on the device in their hand and not on a server [waves hands] "somewhere", in the cloud. | | |
| ▲ | astrange 5 days ago | parent [-] | | It's not difficult to find improvements to LLMs still. Large issues: tokenizers exist, reasoning models are still next-token-prediction instead of having "internal thoughts", RL post-training destroys model calibration Small issues: they're all trained to write Python instead of a good language, most of the benchmarks are bad, pretraining doesn't use document metadata (ie they have to learn from each document without being told the URL or that they're written by different people) |
| |
| ▲ | fennecfoxy 5 days ago | parent | prev | next [-] | | I think they will be, but more for hand-off. Local will be great for starting timers, adding things to calendar, moving files around. Basic, local tasks. But it also needs to be intelligent enough to know when to hand off to server-side model. Android crowd has been able to run LLMs on-device since LlamaCPP first came out. But the magic is in the integration with OS. As usual there will be hype around Apple, idk, inventing the very concept of LLMs or something. But the truth is neither Apple nor Android did this; only the wee team that wrote the attention is all you need paper + the many open source/hobbyist contributors inventing creative solutions like LoRA and creating natural ecosystems for them. That's why I find this memo so cool (and will once again repost the link): https://semianalysis.com/2023/05/04/google-we-have-no-moat-a... | |
| ▲ | brookst 6 days ago | parent | prev | next [-] | | Couldn’t you apply that same thinking to all compute? Servers will always have more, timesharing means lower cost, people will probably only ever own dumb terminals? | | |
| ▲ | aydyn 6 days ago | parent [-] | | Latency. You can't play video games on the cloud. Google tried and failed. | |
| ▲ | wcarss 6 days ago | parent | next [-] | | well, another way to recount it is that google tried and it worked okay but they decided it wasn't moving the needle, so they stopped trying. | |
| ▲ | liamwire 6 days ago | parent | prev | next [-] | | Huh? GeForce NOW is a resounding success by many metrics. Anecdotally, I use it weekly to play multiplayer games and it’s an excellent experience. Google giving up on Stadia as a product says almost nothing about cloud gaming’s viability. | |
| ▲ | Balinares 6 days ago | parent | prev [-] | | Do you mean Stadia? Stadia worked great. The only perceptible latency I initially had ended up coming from my TV and was fixed by switching it to so-called "gaming mode". Never could figure out what the heck the value proposition was supposed to be though. Pay full price for a game that you can't even pretend you own? I don't think so. And the game conservation implications were also dire, so I'm not sad it went away in the end. But on technical merits? It worked great. | | |
|
| |
| ▲ | alwillis 3 days ago | parent | prev | next [-] | | > don't see why not, then the era of viable local LLM inferencing is upon us.
> I don't think local LLMs will ever be a thing except for very specific use cases. I disagree. There's a lot of interest in local LLMs in the LLM community. My internet was down for a few days, and did I wish I had a local LLM on my laptop! There's a big push for privacy; people are using LLMs for personal medical issues, for example, and don't want that going into the cloud. Is it necessary to talk to a server just to check out a letter I wrote? Obviously, with Apple's release of iOS 26, macOS 26, and the rest of their operating systems, tens of millions of devices are getting a local LLM with 3rd-party apps that can take advantage of them. | |
| ▲ | MPSimmons 6 days ago | parent | prev | next [-] | | The crux is how big the L is in the local LLMs. Depending on what it's used for, you can actually get really good performance on topically trained models when leveraged for their specific purpose. | | |
| ▲ | rickdeckard 6 days ago | parent [-] | | There's a lot of L's in LLLM, so overall it's hard to tell what you're trying to say... Is it 'Local'? 'Large'? ...'Language'? | |
| ▲ | fennecfoxy 5 days ago | parent | next [-] | | Clearly the Large part, given the context...LLMs usually miss stuff like this, funnily enough. | |
| ▲ | touristtam 6 days ago | parent | prev | next [-] | | Do you see the C for Cheap in there? Me neither. | | |
| ▲ | rickdeckard 6 days ago | parent [-] | | Sorry I'm not following. Cheap in terms of what, hardware cost? From Apple's point of view a local model would be the cheapest possible to run, as the end-user pays for hardware plus consumption... |
| |
| ▲ | triceratops 5 days ago | parent | prev [-] | | Username checks out. |
|
| |
| ▲ | unethical_ban 6 days ago | parent | prev | next [-] | | It's a thing right now. I'm running Qwen 30B code on my framework laptop to ask questions about ruby vs. python syntax because I can, and because the internet was flaky. At some point, more doesn't mean I need it. LLMs will certainly get "good enough" and they'll be lower latency, no subscription, and no internet required. | | |
| ▲ | nsonha 6 days ago | parent [-] | | pretty amazing, as a student I remember downloading offline copies of Wikipedia and Stack Overflow and felt that I have the entire world truly in my laptop and phones. Local LLMs are arguably even more useful than those archives. |
| |
| ▲ | hotstickyballs 6 days ago | parent | prev | next [-] | | If compute power is the deciding factor server vs edge discussion then we’d never have smartphones. | |
| ▲ | nsonha 6 days ago | parent | prev [-] | | A local LLM may not be good enough for answering questions (though I think that won't be true for much longer) or generating images, but today it should be good enough to infer deeplinks and app extension calls or agentic walk-throughs... and usher in a new era of controlling the phone by voice command. | |
|
|
| ▲ | chisleu 5 days ago | parent | prev | next [-] |
| Because of the prompt processing speed, small models like Qwen 3 Coder 30B A3B are the sweet spot for the Mac platform right now, which means a 32 or 64GB Mac is all you need to use Cline or your favorite agent locally. |
| |
| ▲ | DrAwdeOccarim 5 days ago | parent [-] | | Yes, I use LM Studio daily with Qwen 3 30b a3b. I can't believe how good it is locally. | | |
| ▲ | paool 5 days ago | parent [-] | | Can you use your Qwen instance in CLIs like Claude code, codex, or whatever open source coding agent? Or do you have to copy paste into LM studio? | | |
| ▲ | evilduck 4 days ago | parent | next [-] | | Yeah you can, so long as you're hosting your local LLM through something with an OpenAI-compatible API (which is a given for almost all local servers at this point, including LM Studio). https://opencode.ai and https://github.com/QwenLM/qwen-code both allow you to configure any API as the LLM provider. That said, running agentic workloads on local LLMs will be a short and losing battle against context size if you don't have hardware specifically bought for this purpose. You can get it running and it will work for several autonomous actions but not nearly as long as a hosted frontier model will work. | | |
| ▲ | TrajansRow 4 days ago | parent [-] | | Unfortunately, IDE integration like this tends to be very prefill intensive (more math than memory). That puts Apple Silicon at a disadvantage without the feature that we’re talking about. Presumably the upcoming M5 will also have dedicated matmul acceleration in the GPU. This could potentially change everything in favor of local AI, particularly on mobile devices like laptops. | | |
| ▲ | evilduck 4 days ago | parent [-] | | Cline has a new "compact" prompt enabled for their LM Studio integration which greatly alleviates the long system prompt prefill problem, especially for Macs which suffer from low compute (though it disables MCP server usage, presumably the lost part of the prompt is what made that work well). It seems to work better for me when I tested it and Cline's supposedly adding it to the Ollama integration. I suspect that type of alternate local configuration will proliferate into the adjacent projects like Roo, Kilo, Continue, etc. Apple adding hardware to speed it up will be even better, the next time I buy a new computer. |
|
| |
| ▲ | DrAwdeOccarim 4 days ago | parent | prev [-] | | LM Studio lets you run a model as a local API (OpenAI-compatible REST server). |
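For anyone scripting against that server, a minimal Swift sketch of calling the local OpenAI-compatible endpoint (port 1234 is LM Studio's usual default and the model identifier below is a placeholder; adjust both for your setup):

    import Foundation

    // POST a chat completion to a local OpenAI-compatible server (e.g. LM Studio).
    let url = URL(string: "http://localhost:1234/v1/chat/completions")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "model": "qwen3-coder-30b-a3b",  // placeholder; use whatever model is loaded
        "messages": [["role": "user", "content": "Write a bash script that counts lines of code."]]
    ]
    request.httpBody = try! JSONSerialization.data(withJSONObject: body)

    let task = URLSession.shared.dataTask(with: request) { data, _, error in
        if let data = data, let text = String(data: data, encoding: .utf8) {
            print(text)  // raw JSON response from the local server
        } else if let error = error {
            print("Request failed: \(error)")
        }
    }
    task.resume()
    RunLoop.main.run(until: Date().addingTimeInterval(60))  // keep a command-line script alive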
|
|
|
|
| ▲ | supportengineer 6 days ago | parent | prev | next [-] |
| I was reminded of this today for no particular reason: "iPhone4 vs HTC Evo" https://www.youtube.com/watch?v=FL7yD-0pqZg |
|
| ▲ | ottah 5 days ago | parent | prev | next [-] |
| Nah, memory is still the bottleneck. Kernel performance is already pretty good, but cpu memory is still dramatically slower than gpu memory. |
|
| ▲ | 6 days ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | aagha 5 days ago | parent | prev | next [-] |
| Apple is playing 3D chess while every other PC maker is learning how to play checkers. |
|
| ▲ | bendoy 5 days ago | parent | prev | next [-] |
| I'm most excited about the heart rate sensor in Airpods Pro 3! |
|
| ▲ | amelius 5 days ago | parent | prev | next [-] |
| > It has A19 Pro. But it's not general purpose. Broken by design. I'll pass. Not going to support this. We need less of this crap not more. |
|
| ▲ | Uehreka 6 days ago | parent | prev | next [-] |
| I will believe this when I see it. It’s totally possible that those capabilities are locked behind some private API or that there’s some weedsy hardware complication not mentioned that makes them non-viable for what we want to do with them. |
| |
| ▲ | aurareturn 5 days ago | parent | next [-] | | Already available via Metal: https://x.com/liuliu/status/1932158994698932505 | | | |
| ▲ | llm_nerd 5 days ago | parent | prev [-] | | They might recommend using CoreML to leverage them, though I imagine it will be available to Metal. The whole point of CoreML is that your solution uses whatever hardware is available to you, including enlisting a heterogeneous set of units to conquer a large problem. Software written years ago would use the GPU matmul if deployed to a capable machine. |
|
|
| ▲ | ActorNightly 6 days ago | parent | prev | next [-] |
| Good luck actually getting access to the ANE. There is a reason why PyTorch doesn't use it even though it's been around for a while. |
| |
|
| ▲ | SilverElfin 6 days ago | parent | prev | next [-] |
| deleted |
| |
| ▲ | apparent 6 days ago | parent | next [-] | | According to this page, [1] it reduces unwanted noise 4x as much as the original AirPods Pro and 2x as much as the AirPods Pro 2. Though I do wonder, given the logarithmic nature of sound perception, are these numbers deceptive in terms of what the user will perceive? 1: https://www.apple.com/airpods-pro/ | |
| ▲ | WanderPanda 6 days ago | parent | prev [-] | | It was 4x over the original version IIRC so should be ~ 2x over the previous |
|
|
| ▲ | Aperocky 6 days ago | parent | prev | next [-] |
| So.. 6 hour batteries like the Apple Watch? |
| |
| ▲ | apparent 6 days ago | parent | next [-] | | According to Apple's comparison tool, the Air has 27 hrs of video playback, compared to 30 for the 17 and 39 for the Pro. Based on that, it doesn't sound like it's that much worse. Of course, if you're trying to maximize battery longevity by not exceeding 80% charge, that might make it not very useful for many people. | |
| ▲ | mbirth 6 days ago | parent | prev [-] | | But there’s this now: https://store.apple.com/uk/xc/product/MGPG4ZM/A | | |
| ▲ | wlesieutre 6 days ago | parent | next [-] | | Heck make the phone even thinner and sell it with the battery pack and we'll have reinvented phones with swappable batteries | | |
| ▲ | eloisant 6 days ago | parent | next [-] | | Except now your phone is getting energy wirelessly, which is less efficient and gets hot... A big loss for a small coolness factor. | |
| ▲ | talldan 6 days ago | parent | prev [-] | | The Moto Z was ahead of its time! (it was thinner and had a magnetic battery add-on). You did have to pay extra for the battery, mind. |
| |
| ▲ | stephenlf 6 days ago | parent | prev [-] | | It’s embarrassing how frequently they’ll yank out an important part and sell it as an add-on. | | |
| ▲ | mbirth 6 days ago | parent | next [-] | | Yes, and then don’t even make it compatible with other phones. I’m a big fan of their previous (discontinued) MagSafe battery as that supports reverse charging, charge state display on phone and has the perfect size. This new battery however is only compatible with the Air as other phones have a bigger camera bump. | | |
| ▲ | jq-r 6 days ago | parent | next [-] | | And how much better it would be if it had a physical connector, which would be much more efficient. Then you would have a bigger total charge and it wouldn't cook both batteries in the process. One can dream though. | |
| ▲ | badc0ffee 6 days ago | parent | prev | next [-] | | Excuse me, that hump is the Iconic Plateau. | | | |
| ▲ | crazygringo 6 days ago | parent | prev | next [-] | | Different phones have different sizes and shapes. Not sure what you expect, for a product designed to match the phone's size exactly? And batteries don't last forever. When you upgrade to a new phone after a few years you'd likely need a new one anyways. Worst case scenario just sell the old one on eBay if it's still holding a good charge! | | |
| ▲ | mbirth 6 days ago | parent [-] | | The previous MagSafe battery has the size of the MagSafe wallet and thus fits onto all the iPhones that have MagSafe down to the "Mini" variants. It's the perfect emergency power backup. But Apple discontinued this a while ago. Selling a thin phone with half a battery where you have to buy the other half and keep it attached to get a proper battery runtime (turning it into a normal-sized phone) can't be the solution Apple intended. At least I hope so. And that battery doesn't fit other iPhones as the camera bump of those other phones is in the way. | | |
| ▲ | hombre_fatal 6 days ago | parent | next [-] | | Well, swappable batteries have the UX advantage of being able to swap in full charges. I don't really understand all the complaining since it's merely a variant of the iPhone for people who prioritize thinness over battery. For over a decade, HNers have complained that they don't want thinness to be forced on them and that there should be a separate SKU for it. Yet when it finally happens, HNers complain about the trade-off. | |
| ▲ | bee_rider 6 days ago | parent | next [-] | | If the outcome is that non-air iPhones are allowed to get a little thicker now, that’d be super cool. | |
| ▲ | mathgeek 6 days ago | parent | prev [-] | | Consider that the folks complaining about one thing can be different groups from the ones complaining about another. | | |
| ▲ | petersellers 6 days ago | parent [-] | | Either way, people have options now. If one doesn't like the compromises of the thin phone, they can buy the thick phone. Seems silly to complain about the thinness if you're not the target demographic. |
|
| |
| ▲ | wlesieutre 6 days ago | parent | prev | next [-] | | Magsafe battery has also been a great fix for a 5-year old built-in battery for 1/3 the cost of a battery replacement, and when I finally replace this phone I'll still have the magsafe battery for travel/emergency. 3rd party versions of course, the official one was much more expensive than that. | |
| ▲ | chrisweekly 6 days ago | parent | prev [-] | | "half a battery" yielding 27h seems kinda harsh criticism |
|
| |
| ▲ | justinator 6 days ago | parent | prev [-] | | why wouldn't it be compatible with other phones? | | |
| ▲ | mbirth 6 days ago | parent [-] | | See this photo: https://store.storeimages.cdn-apple.com/1/as-images.apple.co... The camera bump on other models protrudes more towards the centre of the body. And thus the battery wouldn't fit (flush) and the Qi charging wouldn't engage properly. | | |
| ▲ | justinator 6 days ago | parent [-] | | So running the battery perpendicular to this photo isn't an option? I'm sorry if I'm not completely familiar with this product: you are to have this battery attached at all times while you're charging, and it just stays in place? (gawd I sound like I'm from a different planet, I apologize -- wireless charging just never has been interesting to me) | | |
| ▲ | altairprime 6 days ago | parent [-] | | Yes, the power icon on the back of the phone cases is a set of magnets designed to ensure rotation and x,y center magnetic lock. |
|
|
|
| |
| ▲ | wilg 6 days ago | parent | prev | next [-] | | What on earth are you talking about? It includes a battery and there is both a cheaper and a more expensive version that has more battery, plus an add on battery pack. And you’re complaining about what exactly? | |
| ▲ | Nevermark 6 days ago | parent | prev | next [-] | | You could get an iPhone Max as your iPhone Air backup. Or maybe just get the iPhone Max...? Seems like Apple is way ahead of you. | |
| ▲ | brookst 6 days ago | parent | prev | next [-] | | Some people: I don’t want to carry extra battery all the time for the one day a month I need it. Other people: how dare you | |
| ▲ | 6 days ago | parent | prev | next [-] | | [deleted] | |
| ▲ | bigiain 6 days ago | parent | prev [-] | | I, for one, am looking forward to being forced to purchase the add-on MagSafe headphone adaptor. (And the MagSafe floppy drive.) | | |
|
|
|
|
| ▲ | baby 6 days ago | parent | prev [-] |
| IMO it's underwhelming considering folding phones have been out for many years now and we still don't have a folding iPhone. What are the PMs doing at Apple. |
| |
| ▲ | ndiddy 6 days ago | parent | next [-] | | I think folding phones will remain a small niche unless someone figures out how to make a foldable screen that doesn't get permanently scratched by your fingernails. | | |
| ▲ | ugh123 6 days ago | parent | next [-] | | The "unless" argument is where apple has done well: - mobile mp3 player sales are low unless disk and battery life are greatly improved - large display touch screen phone market is small unless someone solves the "app problem" - smart watch market is tiny if exists at all unless someone makes one that is useful and has improved battery life | | |
| ▲ | kennywinker 6 days ago | parent | next [-] | | Pebble had like a week-long battery life. Apple's pitch wasn't better battery life, it was just "that thing for nerds? This is the same, but for everyone else." I.e. it came with seamless integration with your phone, rings to close, a more expensive look, and more polished fitness tracking. The breakthrough that made touchscreen phones work wasn't an app ecosystem. That came after people were already crazy about iPhones. It was capacitive touch screens. Basically everything before was resistive touch, which is why they usually had styluses. Getting touch, and really multi-touch, working well was the game changer that redefined cell phones. | |
| ▲ | blobbers 6 days ago | parent | prev [-] | | To be fair, Apple Watch battery life is atrocious compared to competing models. Their marketing and ecosystem is better. | | |
| ▲ | bentcorner 5 days ago | parent | next [-] | | IMO there's a gap between "charge every day" and "charge once a week" that needs to be crossed. In other words, if they made the battery last twice as long it'd still be equally as annoying (since your daily routine would be nearly the same, except now you also need to remember if it's a charge day or non-charge day). To be fair maybe 3/4 days buys you some convenience. But anyways charging once a day is a reasonable place to get to, to get something better would require at minimum a 3x improvement which probably means a ground-up rework instead of continuous refinement. A battery band might get you there but I suspect it'd be too clunky. At best Apple may redesign their watch to support a battery band and allow 3rd parties to make them for folks that need weeks of battery life. | | |
| ▲ | apparent 5 days ago | parent [-] | | For me, it comes down to two things. First, I do not want to have to charge every night since I use my watch as a silent vibrating alarm, and I track my sleep. It seems like Apple has basically overcome this hurdle, now that you can charge while you shower and basically get by. The other issue is that I don't want to have to bring Yet Another Dongle™ every time I go away for a weekend or short business trip. Most of my trips are ≤ 4 days, so if AWs could reliably go that long (including battery degradation over time) then I'd consider getting one. Right now, only the AWU even approaches this, and only in low-power mode. If it weren't a thousand dollars, I'd consider it. But between the low-power requirement and the pricing, it's just no contest in my book. I'm getting a new Pebble, which offers a month of battery life at 1/3 of the cost. | | |
| ▲ | gf000 4 days ago | parent [-] | | > The other issue is that I don't want to have to bring Yet Another Dongle™ I think reverse charging from your smartphone is a quite decent solution to the problem, which is supported by certain Android devices. | | |
| ▲ | apparent 4 days ago | parent [-] | | If this were possible, it would definitely make a difference for me. |
|
|
| |
| ▲ | Nevermark 6 days ago | parent | prev | next [-] | | I am surprised Apple doesn't sell a battery band for people who want a week's charge. | | |
| ▲ | blobbers 2 days ago | parent [-] | | That would be slick. Perhaps the problem is it would get hot, and possibly burn? |
| |
| ▲ | KoolKat23 6 days ago | parent | prev | next [-] | | I watched the announcement yesterday and was very surprised to hear the watch battery life is still so shocking. Especially considering how useful sleep data is, I was surprised to see they're only getting sleep scores now. My dirt-cheap Huawei watches have had this for years. It's accurate enough (my own perception based on use). And I get a week's battery life too (although I don't have the distracting fancy notifications, perhaps). It does check blood oxygen levels, heart rate, stress, etc. I truly thought this was a solved problem (looking at headphone battery life, although I might need to check whether my assumptions also apply to AirPods). | | |
| ▲ | ralfd 5 days ago | parent [-] | | The watch has sleep data (for example phase durations like REM sleep, plus apnea detection); the Health app just doesn't compute a "score". > I truly thought this was a solved problem. I charge when showering in the morning. 15 minutes is enough for the day + night, half an hour to charge it fully. | | |
| |
| ▲ | seesaw 6 days ago | parent | prev | next [-] | | I switched from Apple Watch to a Garmin Venu. The battery lasts for a week, and many of the sensors are more accurate. | | |
| ▲ | wooger 6 days ago | parent [-] | | And that's the fancy-screen, gimmick-edition Garmin watch - the normal MIP-display Garmin watches (even an old, midrange Forerunner 255) will easily get a couple of weeks of battery life, more for the higher-end ones. OLED is just the wrong screen tech for these devices; it never made any sense to me given how little I care about graphics and how little time I spend reading the display. | |
| ▲ | gf000 4 days ago | parent [-] | | But it's not the screen that drains the battery so fast; it's the general-purpose OS running on a decent CPU. |
|
| |
| ▲ | wahnfrieden 6 days ago | parent | prev [-] | | The new one is 24 hours; is that still atrocious? | | |
| ▲ | SirMaster an hour ago | parent | next [-] | | The new one isn't actually longer. It's just that they changed how they measure it. It assumes 16 awake hours and 8 asleep hours, so the watch lasts 24 hours, but only because you're asleep and not using it for 8 of those hours. | |
| ▲ | klardotsh 6 days ago | parent | prev | next [-] | | Yes. My Pebble Steel got over a week of battery in 2015, had physical, tactile buttons that worked even wearing thick winter gloves, and had an always-on-no-matter-what screen that was clearly readable in full sunlight. Every smartwatch that hasn't met that bar, which is almost all of them ever made, is a joke to me. I'd have ordered a RePebble had I not moved back to analogue dumbwatches instead just before they were announced (and were iOS not actively hostile to competing watch implementations). | | |
| ▲ | brookst 6 days ago | parent | next [-] | | And motorcycles get way better gas mileage than cars. But it’s still odd to frame a (totally understandable!) preference for one product category in those terms. | |
| ▲ | qwezxcrty 6 days ago | parent | prev | next [-] | | If you are okay with less-smart smartwatches, and okay with no hackability, Garmin should have a few with black-and-white displays and >1 week battery life (even indefinite with sufficient solar). | |
| ▲ | qwertytyyuu 6 days ago | parent | prev | next [-] | | That’s not really the same category of device | |
| ▲ | wahnfrieden 6 days ago | parent | prev [-] | | Isn’t that a laggy b&w screen, with no ability to respond to notifs and no cellular? I guess those are OK for some users |
| |
| ▲ | monkeywork 6 days ago | parent | prev | next [-] | | Depends which camp of Apple Watch (or smartwatch in general) users you are asking. The camp that sees the smartwatch as an accessory to their smartphone that does fitness tracking and maybe a few other useful things to avoid pulling their phone out constantly - those people want MUCH longer battery life. The camp that sees the smartwatch as a REPLACEMENT for their smartphone is perfectly fine with the current battery life. | | |
| ▲ | Oreb 6 days ago | parent [-] | | I am closer to the first camp than the second, and I don’t understand why I would need longer battery life. The watch charges very quickly, and there is never a day when I don’t have the chance to charge at some point. I usually do it during my morning shower. | | |
| ▲ | wooger 5 days ago | parent [-] | | 1. People use these GPS watches for Ironman triathlons, ultra running & cycling events etc. They can't and won't charge before the battery is done - and remember that a battery charged daily will degrade significantly. If it's borderline on release, it'll be inadequate after a year. 2. Just for general convenience, having to take another special cable for every late night or overnight trip is maddening. I always have a phone anyway for any actual interactions. I find it hard to believe many people are writing texts on their watches; it's just a nice-to-have gimmick feature that everyone I know has stopped using. | | |
| ▲ | gf000 4 days ago | parent | next [-] | | > and remember the battery with a daily charge will degrade significantly. If it's borderline on release, it'll be inadequate after a year. That has not been my experience though - having used both an Apple Watch and a Pixel Watch for years on end every single day. Absolutely outside my area of expertise, but I would imagine that you can design batteries to have a much longer lifetime (number of recharge cycles) when their capacity is smaller. | |
| ▲ | NetMageSCW 4 days ago | parent | prev [-] | | That’s not how Li-ion charging works - degradation and lifetime (to a first approximation) depend on equivalent full charge cycles. If you charge daily from 80% to 100% or charge every 5 days from 1% to 100%, your battery degradation and lifetime will be the same. |
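A rough back-of-the-envelope sketch of that claim, under the assumption stated above that wear tracks equivalent full charge cycles rather than the number of charging sessions; the helper function and numbers are purely illustrative:

```python
# Illustrative only: compare two charging patterns over the same 5-day window,
# assuming battery wear scales with equivalent full charge cycles.

def equivalent_full_cycles(sessions):
    """Sum of charged fractions; e.g. topping up from 80% to 100% adds 0.2 cycles."""
    return sum(end - start for start, end in sessions)

daily_top_ups = [(0.80, 1.00)] * 5   # charge 80% -> 100% every day for 5 days
one_deep_charge = [(0.01, 1.00)]     # charge 1% -> 100% once every 5 days

print(round(equivalent_full_cycles(daily_top_ups), 2))    # 1.0 equivalent full cycles
print(round(equivalent_full_cycles(one_deep_charge), 2))  # 0.99 equivalent full cycles
```

Under this (simplified) model, both patterns consume roughly one full cycle per five days, which is the equivalence the comment above is pointing at.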
|
|
| |
| ▲ | ericd 6 days ago | parent | prev | next [-] | | Yep, easily the worst part of mine, especially since it has to charge at a different time than my phone to allow for sleep tracking. | |
| ▲ | brewdad 5 days ago | parent | prev | next [-] | | My biggest complaint with my Apple Watch is that I have to choose between sleep tracking and being able to wear my watch all day. | | |
| ▲ | SirMaster an hour ago | parent [-] | | Why? You can get 8 hours of sleep tracking for a 5 minute charge. You really can't charge your watch for 5 minutes before bed? How about during your bathroom routine? You are brushing your teeth for like half that alone. |
| |
| ▲ | KiwiJohnno 6 days ago | parent | prev | next [-] | | Yes, my 5-year-old Garmin still lasts about 10 days. And that's with using GPS tracking + Bluetooth audio for multiple recorded activities. | |
| ▲ | whatevaa 6 days ago | parent | prev | next [-] | | Yes. Simply yes for a lot of people. | | |
| ▲ | wahnfrieden 6 days ago | parent [-] | | Are those people who don’t need interactivity, the ability to respond to notifs, cellular, etc., or are you comparing with something comparable? | | |
| ▲ | michaelt 6 days ago | parent [-] | | I think a lot of people reach into their pocket and get their phone out if they need "interactivity, ability to respond to notifs, cellular, etc" But if you want to leave your smartphone at home, but you still want cellular and notifications, I agree the apple watch is the only game in town even if the battery life sucks. |
|
| |
| ▲ | zdw 6 days ago | parent | prev | next [-] | | Most of this is because of the always-on screen. If you can live without it and switch back to the motion or button to wake mode, you get 30-50% more usage before the battery runs out, which is not a huge improvement but is a legitimate option. A side effect is that this makes your watch look less new, and therefore less of a theft target. | |
| ▲ | numpad0 6 days ago | parent | prev [-] | | real watches last like 24 months minimum | | |
| ▲ | swores 6 days ago | parent [-] | | And bicycles go much further without needing petrol than cars. I agree that Apple Watches don't last long enough between charges, but comparing them to a completely different class of device that's technically the same broad category is pointless. |
|
|
|
| |
| ▲ | criley2 6 days ago | parent | prev | next [-] | | Is this a thing? I've been using a Pixel 9 Pro Fold for one full year now and my inner screen looks pretty flawless. I don't see a single scratch, and I've never used any kind of protector on the inside. This kind of sounds like a "sour grapes" excuse where a really good thing is presumed to suck only because you can't have it. Personally, as someone who isn't really interested in a full tablet, the foldable is really really nice. | | |
| ▲ | ndiddy 6 days ago | parent | next [-] | | > Is this a thing? From Google's official Pixel 9 Pro Fold handling instructions: (https://support.google.com/pixelphone/answer/15090466?hl=en#...) > Flexible screens are softer than traditional phone screens, so avoid contact with sand, crumbs, *fingernails* and sharp objects. | |
| ▲ | amelius 5 days ago | parent | prev | next [-] | | -> sour apple I'm sure that if Apple invented the exact same thing in the exact same way, it would have been the "greatest thing since sliced bread". | |
| ▲ | fumar 6 days ago | parent | prev [-] | | I have an OG Pixel Fold and the inner screen is flawless. My iPhone 14 Pro screen is visibly scratched. The Fold replaced tablets and e readers for me. |
| |
| ▲ | dankwizard 6 days ago | parent | prev | next [-] | | Or an actual seamless hinge. My god are they ugly down the bend. | | |
| ▲ | baby 6 days ago | parent [-] | | it's invisible if you look straight at your phone, never bothered me | | |
| ▲ | dankwizard 5 days ago | parent [-] | | Unfortunately my eyes can see it. | | |
| ▲ | baby 5 days ago | parent [-] | | Look at the latest model, your eyes won't see it. Stop spreading FUD | | |
| ▲ | dankwizard 5 days ago | parent [-] | | I don't know what to tell you - I don't want to brag about my eyesight, but it's pretty good - No matter what angle, no matter what phone, the crease is visible. What would I have to gain lying about this? I could say the same thing - Stop trying to copium your purchase? The tech isn't there yet. | | |
|
|
|
| |
| ▲ | Theodores 6 days ago | parent | prev [-] | | It is a feature, not a bug. For those that are not chronically online, a mobile phone from a decade ago has everything they need. If you only have to phone the family, WhatsApp your neighbours, get the map out, use a search engine and do your online banking, then a flagship phone is a bit over the top. If anything, the old phone is preferable since its loss would not be the end of the world. I have seen a few elderly neighbours rocking Samsung Galaxy S7s with no need to upgrade. Although the S7 isn't quite a decade old, the apps that are actually used (WhatsApp, online banking) will be working with the S7 for many years to come since there is this demographic of active users. Now, what if we could get these people to upgrade every three years with a feature that the 'elderly neighbour' would want? Eyesight isn't what it used to be in old age, so how about a nice big screen? You can't deliberately hobble the phone with poor battery life or programme it to go slow in an update because we know that isn't going to win the customer over, but a screen that gets tatty after three years? Sounds good to me. | | |
| ▲ | epolanski 6 days ago | parent | next [-] | | > the apps that are actually used (WhatsApp, online banking) will be working with the S7 for many years to come I have several apps that no longer work on my otherwise good phone, bought in 2018, because I can no longer update the OS to the version they require. | | |
| ▲ | bayindirh 6 days ago | parent [-] | | Can you give any examples? My apps only stop upgrading, not stop working out of the blue. Edit: This is an honest question. | | |
| ▲ | epolanski 6 days ago | parent [-] | | Banking apps are a common example that require you to be on the latest OS version, yet my phone is stuck in Android 10 land. WhatsApp also no longer works on it, thus the phone is useless. Which is sad, as it has a great camera, battery life and is very light. | |
|
| |
| ▲ | ta1243 5 days ago | parent | prev | next [-] | | It's the software updates that are the problem. Apple aren't too bad, but their hardware support only seems to last 7 years. The S7 you mention lasted 4 years, and received the last patch in 2020. Not convinced that doing online banking on a phone that hasn't had software updates for 5 years is a good idea. | |
| ▲ | pdimitar 3 days ago | parent [-] | | It's definitely a bad idea but people hate being coerced into spending $500+ just to continue doing what they already do. A lot of us techies can be strong-armed via FOMO and other tropes, but good luck convincing the elderly neighbors. |
| |
| ▲ | somewhereoutth 6 days ago | parent | prev [-] | | Samsung Galaxy A40 checking in. It's small, has dual sim card sockets, and a headphone jack. I'm not sure how I'd replace it to be honest. | | |
| ▲ | autoexec 6 days ago | parent | next [-] | | I still want a phone with expandable storage and a headphone jack. Sony had one, but I don't know if they're selling them and I've heard they have their own issues too. |
| ▲ | Nursie 6 days ago | parent | prev [-] | | Honest question here - is there a situation where you need to be able to use the headphone jack and USB-C at the same time? Because there are very cheap, lightweight adaptors to headphone jack from USB-C. | | |
| ▲ | somewhereoutth 5 days ago | parent | next [-] | | One extra 'thing' to need - at the moment I know that I can play music through anything that has a line-in, with just a cable. However Bluetooth seems to work ok - for devices that support it. | |
| ▲ | smelendez 6 days ago | parent | prev [-] | | Not OP but my concern is putting strain on the charging port by walking with headphones while my phone is in my pocket. Wireless chargers are pretty good but it’s still a pain to wear out your port. | | |
| ▲ | Nursie 6 days ago | parent [-] | | There are some 90-degree adapters that would probably minimise that. I can dual-SIM my iPhone by using one eSIM and one physical. The only thing it is not, is small... |
|
|
|
|
| |
| ▲ | bayindirh 6 days ago | parent | prev | next [-] | | > What are the PMs doing at Apple. Probably trying to find better screen materials, and addressing reliability issues. I used Palm devices with resistive touch screens. It was good, but when you go glass, there's no turning back. I would never buy a phone with a folding screen protected by plastic. I want a dependable slab. Not a gimmicky gadget which can die any moment. I got my fix for dying flex cables with Cassiopeia PDAs. Never again. | |
| ▲ | baby 6 days ago | parent [-] | | My girlfriend broke her iPhone screen twice in two weeks; the second time we didn't bother repairing the screen, and now she has a broken screen which looks really ugly. I've dropped my Google Pixel 9 Pro Fold countless times and the screen is still intact and flawless. So not sure what you're talking about. | |
| ▲ | 2muchcoffeeman 6 days ago | parent [-] | | I’ve dropped my iPhone 15 more times than any phone I’ve ever had. Still fine. I don’t know how I got away with it. Am I representative? Dunno. | | |
| ▲ | Thlom 5 days ago | parent | next [-] | | I've dropped my iPhone 12 a million times with no screen protection or cover. Still fine apart from some scratches around the edges. | |
| ▲ | bayindirh 5 days ago | parent | prev [-] | | My and my wife's iPhones took numerous dives over the years. They're fine, too. |
|
|
| |
| ▲ | erikpukinskis 6 days ago | parent | prev | next [-] | | Folding phones are ~1.5% of the market. Apple cancelled their mini line which was 3% of sales. It’s not a big enough slice for them to want to chase. | | |
| ▲ | icedchai 6 days ago | parent | next [-] | | I prefer a smaller phone, something that fits in your pocket easily with glasses, and am still rocking an iPhone 13 mini. | |
| ▲ | ansc 6 days ago | parent | next [-] | | I am getting more and more nervous that there will be no good upgrade for me. 13 Mini is such a good size! | | | |
| ▲ | vizzah 6 days ago | parent | prev [-] | | yeah... and I am buying an unopened one for $2k on eBay to keep for the future, in case my current one is lost. |
| |
| ▲ | epolanski 6 days ago | parent | prev | next [-] | | Folding phones are a niche because they are very expensive to be honest. | | |
| ▲ | SirMaster 44 minutes ago | parent | next [-] | | The Samsung Flip 7 costs $900 and is cheaper than most iPhones... | |
| ▲ | 0x457 5 days ago | parent | prev | next [-] | | Sure, but I wouldn't buy one even if it was in the same price range as phones I usually buy. For me, it will be useful rarely and cumbersome to use the rest of the time. | |
| ▲ | dbg31415 6 days ago | parent | prev | next [-] | | I picked up a folding phone a while back just to test it out, and honestly they're still pretty underwhelming. The screen isn't really big enough or the right shape to feel like a real upgrade for movies, and a lot of apps just aren't built with foldables in mind. Most of the time it just feels like a weirdly shaped, less powerful, less durable tablet. On top of that you're dealing with a visible crease across the screen, higher prices for something that's actually more fragile, and bulkier hardware with smaller or split batteries. The tech is cool in theory, but in practice it's a lot of compromises without a clear killer use case. | | |
| ▲ | baby 6 days ago | parent [-] | | which phone was that? I bought the pixel folding 9 last year and it has basically replaced my ipad pro. I watch movies, shows, youtube videos, read PDFs on it, it's really good | | |
| ▲ | dbg31415 6 days ago | parent [-] | | Samsung Galaxy Z Fold. | | |
| ▲ | ulfw 5 days ago | parent | next [-] | | Things have evolved a ton. I've got an Oppo Find N5. Thinner than the iPhone Air when unfolded. Same size as the iPhone 16 Pro Max when folded. 16GB RAM, fastest Snapdragon, okay cameras, the screen is magnificent, crease basically invisible in day-to-day use. Battery larger than any iPhone battery (thanks to silicon-carbon) |
| ▲ | baby 6 days ago | parent | prev [-] | | The tech has got really good really quickly! |
|
|
| |
| ▲ | arcticbull 6 days ago | parent | prev [-] | | I have a folding android and it’s very meh. Wouldn’t get one again. It was also free with a prepaid phone plan so I doubt cost is really the factor. | | |
| ▲ | swores 6 days ago | parent | next [-] | | Free with a plan just means you paid for it in installments without them breaking down how much of your monthly payment is going towards the device vs towards the network use. Had you opted for a cheaper device you could have got the same plan for less money. The phone is never actually free, just cleverly marketed to seem free. | |
| ▲ | epolanski 6 days ago | parent | prev | next [-] | | Good foldables are way above the $1000 mark. | |
| ▲ | SirMaster 43 minutes ago | parent | next [-] | | Is the Flip 7 not a good foldable? It's less than $1000. | |
| ▲ | TeMPOraL 6 days ago | parent | prev | next [-] | | I recently got one of these (Galaxy Z Fold 7) and I can't imagine ever going back to a regular phone. The big screen is what makes the phone finally begin to resemble an actual productivity tool. | |
| ▲ | arcticbull 6 days ago | parent | prev [-] | | What makes a good foldable better than say a $700 RAZR? | | |
| ▲ | Miraste 6 days ago | parent [-] | | They're tablet sized screens folding to phone size, instead of phone size folding to half phone size. | | |
| ▲ | Nevermark 6 days ago | parent [-] | | Apple should create 1.5x1.5 and 2x2 inch variations of a wrist "Panel Watch Ultra". It'd be great for diving - and everything else. That would be the half-sized phone I would buy. |
|
|
| |
| ▲ | nicoburns 6 days ago | parent | prev [-] | | > It was also free with a prepaid phone plan It's not really free. It's just built in to the cost of your plan. Your plan would be half the price if you weren't paying for the phone. | | |
| ▲ | johnisgood 5 days ago | parent [-] | | Yeah, why do people call it free? You do pay for the phone, just not the full amount upfront. | | |
| ▲ | arcticbull 5 days ago | parent [-] | | It was a prepaid plan that was the same price whether I got the phone or not. I guess you could say everyone who didn’t get the phone was subsidizing those who did, but there’s no way to opt for lower pricing if you BYOD. So no, in this case that’s not really true. If it were Verizon where you can pay less if you BYOD then sure but that’s not what I did. | | |
| ▲ | johnisgood 4 days ago | parent [-] | | > It was a prepaid plan that was the same price whether I got the phone or not. Fair enough, then it makes sense to get the phone. |
|
|
|
|
| |
| ▲ | catach 6 days ago | parent | prev | next [-] | | > It’s not a big enough slice for them to want to chase. Typical strat for them is not to be first with an innovation, but to wait and work out the kinks enough that they can convince people that the tradeoffs are well worth making. Apple wouldn't be chasing that existing slice, they'd be trying to entice a larger share of their customers to upgrade faster. | |
| ▲ | baby 6 days ago | parent | prev | next [-] | | Folding phones are also double the price. If the price comes down I would expect them to dominate the market. | |
| ▲ | amelius 5 days ago | parent | prev | next [-] | | Yes, in some way everybody is in the 1.5% of something. Apple users will therefore never be 100% happy. Apple is a compromise. But they're also opinionated and very good at telling their users what they should like. | |
| ▲ | cyberax 6 days ago | parent | prev [-] | | Folding phones are extremely popular in China, where nobody cares about Apple anymore. They are now seen as a status symbol because they are significantly more expensive. |
| |
| ▲ | jsheard 6 days ago | parent | prev | next [-] | | I think they'd rather sell you an iPhone and an iPad Mini than one device that does both, just like they'd rather sell you an iPad Air/Pro and a MacBook with basically the same internals, rather than a convertible macOS tablet. | |
| ▲ | baby 6 days ago | parent [-] | | I basically stopped using my iPad Pro since I bought the Pixel 9 Pro Fold |
| |
| ▲ | meindnoch 6 days ago | parent | prev | next [-] | | Aside from the obvious mechanical issues, the screen quality compromises, et cetera, folding phones are just dorky. Apple wants their products to be anything but dorky. There will never be a folding iPhone, simple as. | | |
| ▲ | rafaelmn 6 days ago | parent | next [-] | | The Apple Watch is like the definition of dorky-looking - so much for that theory. Also flip phones aren't dorky and have a 2000s vibe - but they don't fit Apple's "you can have any color as long as it's black" approach to design. In some ways I can't even fault them - fragmenting your device shapes/experiences to chase a niche look is not good business. But this is exactly what's pushing me out of the Apple ecosystem - it's so locked down that if you don't want to fit into their narrow product lines you have no other options. There are no third-party watch makers using Apple Watch hardware and software. No other phone makers with access to iPhone internals and iOS. Nobody can hack a PC OS onto an iPad or build a 2-in-1 macOS device. I feel like this is the last gen of Apple tech I'm in on - I just find there are so many devices that are compelling to me personally but don't fit into the walled garden. Plus Google seems light-years ahead on delivering a smart assistant. | |
| ▲ | seec 6 days ago | parent | next [-] | | I'm with you. Long term Apple customer and it feels like they really don't care about anything that I would like them to do. It's OK but it feels bad because you are kind of trapped with their stuff if you invested in their ecosystem. | |
| ▲ | FirmwareBurner 6 days ago | parent | prev | next [-] | | >Apple watch is like the definition of dorky looking Meanwhile my Casio calculator watch: "bonjour" | | |
| ▲ | rafaelmn 6 days ago | parent [-] | | I was going to write that the only nerdier thing I can think of is wearing a calculator watch - but even that's like nerd fashion and having a rectangular screen strapped to your wrist is just all about utility. | | |
| ▲ | NetMageSCW 4 days ago | parent [-] | | Rectangular watches have been around for over 100 years and have just been regular fashion for much of that time: https://teddybaldassarre.com/blogs/watches/rectangle-watches... | | |
| ▲ | rafaelmn 4 days ago | parent [-] | | If you mistake any of these for an Apple Watch at less than 100m you need glasses. There's nothing wrong with rectangular watches - a fat, bezel-less screen rectangle around your wrist is not the same thing. The Pro comes closest to a proper watch look but even that's "Inspector Gadget" territory, not a fashion accessory. |
|
|
| |
| ▲ | bigyabai 6 days ago | parent | prev | next [-] | | Don't know why you're downvoted. My boyfriend wears the Apple Watch Ultra in public and looks like a complete dork. He's got a pretty big wrist, too! I left the ecosystem after Catalina, and my experience with macOS at work has horrified me enough to stay well away. Nowadays I'm happily using NixOS on the desktop, laptop and homeserver. My biggest gripe is that I didn't switch sooner, probably could have saved a decent amount of cash eschewing the Apple tax, SaaS fees and macOS migration hamster-wheel. | |
| ▲ | RyanOD 6 days ago | parent | prev | next [-] | | I'm going to respectfully disagree with the Apple Watch being labeled "dorky". I think they look pretty nice - and I don't own one. I wear a Timex Ironman. | | |
| ▲ | rafaelmn 6 days ago | parent | next [-] | | There's no mistaking it for any watch out there - which means people wouldn't wear a watch like it if it wasn't for the function. | |
| ▲ | dingnuts 6 days ago | parent | prev [-] | | I definitely think everyone with an Apple Watch looks like a schmuck |
| |
| ▲ | hombre_fatal 6 days ago | parent | prev [-] | | That Apple Watch is ubiquitous suggests that it's not seen as dorky. Apparently in 2022, 80% of iPhone owners also had the AW. | | |
| ▲ | macNchz 6 days ago | parent | next [-] | | I found this stat a little hard to believe, so I looked up what appears to be the source: the 81% is among iPhone owners who already own a smartwatch, not among all iPhone owners https://www.statista.com/chart/31973/likelihood-of-iphone-us... It's hard to find a source for how many iPhone owners specifically own a smartwatch, but in the US ~35% might be a decent estimate of smartwatch ownership, so it'd be more in the realm of ~28% of iPhone owners also having an Apple Watch. | |
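A minimal sketch of that back-of-the-envelope estimate, assuming ~35% smartwatch ownership among iPhone owners and that 81% of those smartwatches are Apple Watches; both inputs are rough figures from the comment above, not a primary source:

```python
# Rough estimate only; both inputs are assumptions from the thread above.
smartwatch_ownership = 0.35  # assumed share of iPhone owners with any smartwatch
apple_watch_share = 0.81     # assumed share of those smartwatches that are Apple Watches

print(f"{smartwatch_ownership * apple_watch_share:.0%}")  # ~28%
```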
| ▲ | hombre_fatal 4 days ago | parent [-] | | True, it did seem a bit unbelievable. Either way, if you look around, Apple Watch is worn by all walks of life and just doesn't have the dorky vibe HNers might insist it has. |
| |
| ▲ | bigyabai 6 days ago | parent | prev [-] | | 100% of iPhone users also use the App Store. Anyone who owns a Mac will tell you that's not due to immense satisfaction or competitive zeal, though. |
|
| |
| ▲ | noarchy 6 days ago | parent | prev | next [-] | | Watch the leaks over the next year or so. There have been rumours of a foldable coming as soon as next year. | |
| ▲ | baby 6 days ago | parent | prev [-] | | I remember thinking that the first iPad was dorky; oh boy, did I misread the market. Oh, and I remember everyone mocking the AirPods Pro when they came out. Now everyone is wearing them. For phones, what really matters for most people is... the screen size. And a folding phone is basically the best thing you can get right now for that. The only problem is pricing at the moment. |
| |
| ▲ | Miraste 6 days ago | parent | prev | next [-] | | They're in the right. Folding phones are great, and I've used one for years, but the technology hasn't reached Apple levels. Get rid of the crease, make the screen less scratchable, and make them waterproof, and then it could go in an iPhone. | |
| ▲ | boppo1 6 days ago | parent | prev | next [-] | | Folders seem gimmicky to me | | |
| ▲ | km3r 6 days ago | parent | next [-] | | I'm never going back to a non-foldable. The ability to have a full-sized phone take up half as much space in my pocket is amazing. Consistently more comfortable moving around. | |
| ▲ | brulard 6 days ago | parent | next [-] | | Maybe half the length; not sure about half the space. | |
| ▲ | eloisant 6 days ago | parent | next [-] | | Phones are already thin enough, I don't mind doubling the thickness. Length is the problem. | |
| ▲ | qwertytyyuu 6 days ago | parent | prev [-] | | Yeah, the Z Fold style is the way to go |
| |
| ▲ | ricardobeat 6 days ago | parent | prev [-] | | That’s because most Android phones are tablet sized. We could simply have smaller phones. |
| |
| ▲ | baby 6 days ago | parent | prev | next [-] | | Until you use one | | |
| ▲ | dmix 6 days ago | parent | next [-] | | I tested the Samsung one in the store and that groove thing in the screen would drive me crazy | | |
| ▲ | baby 4 days ago | parent [-] | | I've had a Pixel 9 Pro Fold for a year and I've never noticed it unless I look at the phone from the side. It's interesting that this is the criticism that comes up the most here, when it's already a solved issue | |
| |
| ▲ | sniffers 6 days ago | parent | prev | next [-] | | They have a nightmare of a crease. Every single one. Even slight warping causes me to recoil. No, I've used one, they are absolutely unusable for me. | | |
| ▲ | baby 6 days ago | parent [-] | | You can't see the crease anymore. Source: I own a Pixel 9 Pro Fold | |
| ▲ | sniffers 5 days ago | parent [-] | | If you hold it up to light to get a reflection, you are telling me there's zero perceptual warping of that reflection around the crease? None? It's as flat and perfect as a single sheet of glass? |
|
| |
| ▲ | humpty-d 6 days ago | parent | prev | next [-] | | I'd agree if there were fewer compromises required to pull it off. | |
| ▲ | pwthornton 6 days ago | parent | prev | next [-] | | What’s the main use case of this? | | | |
| ▲ | jjtheblunt 6 days ago | parent | prev [-] | | so what do you use on yours with more dexterity than without it? | | |
| ▲ | baby 6 days ago | parent [-] | | Watching videos and reading PDFs (whitepapers) are the two big upgrades; being able to take selfies while seeing yourself is often useful as well |
|
| |
| ▲ | ls-a 6 days ago | parent | prev | next [-] | | We're already in the trifold era. Check this video to see some useful features https://www.youtube.com/watch?v=vp5i0jQggK4 | |
| ▲ | bigyabai 6 days ago | parent | prev [-] | | iPads seem gimmicky to me. Somehow, they sell... | | |
| ▲ | __loam 6 days ago | parent | next [-] | | It's great for watching shows on a stationary bike, reading manga, and as a drawing tablet. There's a bunch of artists that only use procreate. | | |
| ▲ | bigyabai 6 days ago | parent [-] | | It's a hard sell for curmudgeons like me with a laptop that does everything you listed and more. Maybe I'm the idiot, but you won't catch me dead paying laptop money for a neuter-computer. | | |
| ▲ | Miraste 6 days ago | parent | next [-] | | I don't think this is fair. Of the uses listed: iPads are better for watching shows on a stationary bike, since they fit on the bike; iPads are better for reading manga, since you can hold them vertically; and iPads are clearly better for drawing, since you can't draw on a laptop. There are some hybrid laptops that do these things, but they're bad at them. Especially drawing; I've used enough HP convertibles with "stylus support" over the years to know that. | |
| ▲ | __loam 6 days ago | parent | prev | next [-] | | Hey good for you man. It's still one of the most popular drawing tablets on the market. | | |
| ▲ | bigyabai 6 days ago | parent [-] | | The sous vide is one of the most popular ways to prepare a steak. It still doesn't replace even a 10th of what my kitchen is capable of. | | |
| |
| ▲ | skhr0680 6 days ago | parent | prev [-] | | This, especially now that macOS has an ARM target and there’s essentially (literally?) no difference between iPad and MacBook hardware | |
| ▲ | addaon 6 days ago | parent [-] | | > there’s essentially (literally?) no difference between an iPad and MacBook hardware Form factor. Touch screen. GPS. Cellular. Circular polarization. These are all literal hardware differences between the iPad and MacBook, and every single one of them makes the iPad suitable for my use case (ForeFlight running on an iPad mounted to the yoke) where a MacBook would not be. | | |
| ▲ | bigyabai 6 days ago | parent [-] | | You can get a laptop with all those built-in, though. The only reason the Mac lacks those things is artificial market segmentation. | | |
| ▲ | addaon 5 days ago | parent | next [-] | | Also, can you give an example of a laptop (or non-Apple tablet) with a circularly polarized LCD? I've never been able to find one, but it's not a spec that's often published… | |
| ▲ | addaon 6 days ago | parent | prev [-] | | Sure, but none of them run ForeFlight, so… |
|
|
|
|
| |
| ▲ | lisbbb 6 days ago | parent | prev | next [-] | | Schools: for better or for worse, schools buy gobs of iPads. |
| ▲ | 8note 6 days ago | parent | prev | next [-] | | I see one at basically every store or bar, as an easily configurable POS |
| ▲ | brulard 6 days ago | parent | prev [-] | | Are you serious? For anything that needs more screen real estate - reading, browsing, watching or organizing photos/videos, or simply if your sight is not as good anymore - it's so much better than a phone. And with the price tag around $350, that is amazing value. |
|
| |
| ▲ | yoyohello13 6 days ago | parent | prev | next [-] | | The PMs are probably thinking folding phones are dumb…because they are. | | |
| ▲ | ls-a 6 days ago | parent [-] | | Someone else commented that the iPhone Air is so thin as a result of Apple working on a folding phone (they have to be thin). I agree. The iPhone Air basically looked like low-hanging fruit while they're at it. Apple is known to take its time, so that makes sense |
| |
| ▲ | nylonstrung 6 days ago | parent | prev | next [-] | | Marques Brownlee said they have prototypes for a folding phone and will likely release one | |
| ▲ | swiftcoder 6 days ago | parent | prev | next [-] | | Do any of the folding phones actually work well? I still haven't seen one in the wild (admittedly, I'm not living in a tech Mecca these days) | | |
| ▲ | dboreham 6 days ago | parent | next [-] | | I've had the past three generations of Samsung folding phone (4,5,6). My use-case is for travel, where I want to read books, and the very occasional time when I want to do some design work outside the office -- draw a diagram that sort of thing. A third rare use case is where a web site is buggy or limited in functionality for mobile browsers. In all these cases the unfolded screen allows me to do the thing I need to do without carrying a second device (tablet, eReader). Another marginal use-case is to show another person a photograph. The fold out screen is much easier to see and I think has better color rendition too. For these use-cases I find the folding phone very worthwhile. But...the benefit that trumps all that is that the phone itself is smaller (narrower) than the typical flagship phones these days. It fits in my pocket and my hand reaches across it. I'd never go back to a non-folding phone for this reason alone, even if I never unfolded it. In fact I almost never do unfold it, except when traveling. fwiw it wasn't until the Fold6 that the "cover screen" typing experience was ok. I understand that the Fold7 is a bit wider and so probably better, but I can't justify the expense to upgrade so will sit out until the Fold8. | |
| ▲ | carstenhag 6 days ago | parent | prev | next [-] | | Tried the Fold at a Google event and it was really nice. I would get one, but I don't want to spend so much money. | |
| ▲ | theshackleford 5 days ago | parent | prev | next [-] | | The Z Fold 7 I tried worked so well, I'm tempted to move away from Apple for the first time since my Galaxy Note 3. | |
| ▲ | dingnuts 6 days ago | parent | prev | next [-] | | They do work well but are fragile. I broke one by gently closing it on a hot day (about 100°F). Saw another break from the kind of short fall that used to break phones before they all got Gorilla Glass. I guess if you're the sort that is not clumsy and you're in a mild climate, you might get your money's worth. For reference, these were Samsung Z Flip devices | |
| ▲ | baby 6 days ago | parent [-] | | Owner of the Pixel 9 Pro Fold here and I drop it constantly, no issues |
| |
| ▲ | gdbsjjdn 6 days ago | parent | prev | next [-] | | The vertical fold ones might be better. I had the newest Samsung Flip (horizontal folding) and the screen died twice. Both times from a small rupture on the seam. The tech at the phone place said it happens constantly, and it costs hundreds of dollars to replace out of warranty. | |
| ▲ | baby 6 days ago | parent | prev [-] | | The google folding pro works really well |
| |
| ▲ | caycep 6 days ago | parent | prev | next [-] | | I dunno, I always felt folding phones added unnecessary complexity and moving parts. The slab phone seems closer to a platonic ideal and, from a user/engineering perspective, has fewer compromises | |
| ▲ | baby 6 days ago | parent [-] | | Honestly, most criticism I see of folding phones could basically be resolved by just trying one; it's so useful when you need a larger screen |
| |
| ▲ | runako 6 days ago | parent | prev | next [-] | | In all seriousness, is there a folding phone that doesn't have a crease in the screen while unfolded? The one I have used felt like using a real phone through a layer of vinyl, definitely not a pleasant experience. | | |
| ▲ | TeMPOraL 6 days ago | parent [-] | | The crease is something you barely even notice 5 minutes into using one. |
| |
| ▲ | rickdeckard 6 days ago | parent | prev | next [-] | | > IMO it's underwhelming considering folding phones have been out for many years now and we still don't have a folding iPhone. What are the PMs doing at Apple. They're buying another year of very-high margin phones I guess... | |
| ▲ | busymom0 6 days ago | parent | prev | next [-] | | I know they have been out for a while but I have yet to see a single one in person. They just don't make up much of the market. | |
| ▲ | pdntspa 6 days ago | parent | prev [-] | | Why do we need a folding phone? | | |
| ▲ | bayesianbot 6 days ago | parent | next [-] | | I would never have bought one before but nowadays it could actually be useful. You could have Codex or Claude Code in your pocket, and every ~15min check the work and write a new prompt. Tablets are too big (for me) to constantly carry around for this, and phones annoyingly small for that use. | |
| ▲ | baby 6 days ago | parent | prev [-] | | because we want larger screens | | |
| ▲ | pdntspa 6 days ago | parent [-] | | what? why? they're already bigger than one hand -- way too big! Get a computer | | |
| ▲ | 0x457 5 days ago | parent | next [-] | | It caters to a very specific target group: people who mostly make voice calls[1] and who look at a lot of information at once[2]. [1]: A mediocre folded experience doesn't bother them [2]: Think calendars and whatever else middle managers look at |
| ▲ | baby 5 days ago | parent | prev [-] | | Why get a computer if you just want to read or watch videos? | | |
| ▲ | pdntspa 5 days ago | parent [-] | | YouTube with uBlock Origin is a game changer. Or you can pull up VLC and peruse your collection of locally-stored content | |
| ▲ | gf000 4 days ago | parent [-] | | On Android you can just patch the native YouTube app on your phone with ReVanced to disable ads. |
|
|
|
|
|
|