| ▲ | postalcoder 7 days ago |
| To those wondering why the MacBook would have a sensor for this, it’s likely there to support Desk View[0]. It shows the items on your desk in a geometrically correct, top-down view. Knowing the angle of the display is very helpful when applying keystone correction. 0: https://support.apple.com/en-us/121541 |
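A rough sketch of why knowing the lid angle helps here (purely illustrative, not Apple's actual pipeline): for a camera that is tilted rather than moved, the resulting image distortion of a planar scene is a homography, so a known tilt angle lets you construct the transform and invert it to get the "top-down" view. The pinhole model and all names below are assumptions for the sketch:

```python
import numpy as np

def tilt_homography(angle_deg, focal=1.0):
    """Homography induced by pitching the camera about its x-axis.
    For a pure rotation, image points map as H = K @ R @ inv(K)."""
    t = np.radians(angle_deg)
    K = np.diag([focal, focal, 1.0])  # toy intrinsics
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0, np.cos(t), -np.sin(t)],
                  [0.0, np.sin(t), np.cos(t)]])
    return K @ R @ np.linalg.inv(K)

def apply_h(H, pts):
    """Apply a 3x3 homography to an Nx2 array of points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    out = pts_h @ H.T
    return out[:, :2] / out[:, 2:3]

# A square on the desk, as seen by a camera pitched 30 degrees,
# becomes a keystone trapezoid; knowing the angle lets us undo it.
square = np.array([[-1, -1], [1, -1], [1, 1], [-1, 1]], float)
H = tilt_homography(30)
distorted = apply_h(H, square)                    # keystoned view
restored = apply_h(np.linalg.inv(H), distorted)   # corrected view
print(np.allclose(restored, square))  # True
```

The real feature presumably also handles intrinsics, lens distortion, and the wide-angle crop, but the lid-angle term is what pins down the pitch component of the correction.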
|
| ▲ | OJFord 7 days ago | parent | next [-] |
| Simpler than that I think - when do you turn off the screen or sleep? Not only when the lid is fully closed - you want to be able to 'privacy-duck' the screen a bit before that, and having a sensor rather than just a fixed-angle switch makes the threshold software-defined and something they can update. |
| |
| ▲ | hamandcheese 7 days ago | parent | next [-] | | I'm pretty sure the sensor for that is a simple reed switch. | | |
| ▲ | OJFord 7 days ago | parent | next [-] | | A reed switch (plus magnet and choice of location) would be an implementation of a 'fixed angle switch' per my comment above. | | |
| ▲ | vasco 6 days ago | parent [-] | | If you bring something metallic close to the top of the base on the left side of most MacBooks, you can feel where the magnet is. They either have both systems, or maybe they switched this recently. | | |
| ▲ | OJFord 5 days ago | parent [-] | | Presence of a magnet doesn't imply presence of a reed switch - are you sure that's not just to give it some resistance to opening for example? Or angle sensing could be implemented with a magnet and Hall effect sensor. |
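For what it's worth, off-the-shelf rotary Hall encoders of the kind that could sit in a hinge typically read two orthogonal field components from a rotating magnet and decode the angle with atan2. A toy sketch of just the decode step (names and scaling are illustrative assumptions, not any particular part's API):

```python
import math

def hinge_angle(b_sin, b_cos):
    """Recover the rotation angle from two orthogonal Hall readings.
    The two channels measure field components proportional to
    sin(theta) and cos(theta); atan2 recovers theta with full-circle
    range and without dividing by a possibly-zero channel."""
    return math.degrees(math.atan2(b_sin, b_cos)) % 360.0

print(hinge_angle(0.5, 0.8660254))  # ~30.0
```

The same magnet could of course also drive a simple threshold (sleep below some angle), which is why the presence of a magnet alone doesn't tell you which scheme is in use.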
|
| |
| ▲ | tesseract 7 days ago | parent | prev | next [-] | | More likely a Hall effect sensor, which is solid state and a lot smaller. And yes, older MacBooks had something like that, as evidenced by the fact that you could put them to sleep by holding a magnet in the right place (just to the left of the trackpad, IIRC, in the models I'm familiar with). | | |
| ▲ | 0_____0 6 days ago | parent [-] | | I pranked a coworker once by sticking a magnet to his desk somehow to get his macbook to sleep when his computer was in a certain spot. | | |
| ▲ | rafaelmn 6 days ago | parent [-] | | Nice one! Curious, since I know almost nothing about HW: do magnets screw with computer HW otherwise? I would guess no, since we don't use HDDs anymore, but I'm not sure. | | |
| ▲ | Johnbot 6 days ago | parent [-] | | As far as I know, even HDDs were pretty resilient to magnets when in their enclosures. I once took a large magnet meant for holding together concrete forms, strong enough that, stuck to a ferrous surface, it could probably support my weight, and left it on a hard drive for a full year to see if it'd break. The drive, and all of the data on it, were fine. |
|
|
| |
| ▲ | rzzzt 7 days ago | parent | prev [-] | | When I ran a MacBook Pro in closed clamshell mode and put another laptop on top of it, it went to sleep. Must be a weight sensor in there as well. (/s) | | |
| |
| ▲ | kelnos 7 days ago | parent | prev | next [-] | | Why though? That seems unnecessarily complex. It seems fine to me to just use a reed switch and sleep when the lid is closed or very close to closed. | | |
| ▲ | missinglugnut 7 days ago | parent | next [-] | | It's one sensor in both cases, and in the latter case you can do so much more: change the thresholds in an update, detect when the lid is in the process of closing, apply hysteresis (with a simple switch, there's an angle where vibration could cause it to bounce between reading open and closed, but with an angle sensor you can use different thresholds for the open-to-closed and closed-to-open transitions). But most of all... you don't have to commit to a behavior early in the design process by molding the switch in exactly the right spot. If the threshold you initially pick isn't perfect, it's much easier to change a line of code than the tooling at the manufacturing plant. | |
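The hysteresis trick described above is easy to sketch: require the lid to open past one threshold to count as open, and close below a lower one to count as closed, so vibration near either threshold can't toggle the state. The threshold values here are made up for illustration, not Apple's:

```python
class LidState:
    """Hysteresis on a lid-angle reading: open past OPEN_DEG to
    register open, close below CLOSE_DEG to register closed.
    Thresholds are illustrative placeholders."""
    OPEN_DEG = 15.0
    CLOSE_DEG = 10.0

    def __init__(self):
        self.is_open = False

    def update(self, angle_deg):
        if not self.is_open and angle_deg > self.OPEN_DEG:
            self.is_open = True
        elif self.is_open and angle_deg < self.CLOSE_DEG:
            self.is_open = False
        return self.is_open

lid = LidState()
# Jitter around 12 degrees (between the two thresholds) no longer
# flips the state back and forth:
readings = [5, 20, 12, 11, 13, 12, 9, 12]
states = [lid.update(a) for a in readings]
print(states)  # [False, True, True, True, True, True, False, False]
```

With a single fixed-angle switch there is only one threshold, so the equivalent jitter right at that angle would bounce the reading between open and closed.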
| ▲ | Reason077 7 days ago | parent | prev [-] | | Why use two sensors when one will do? If you already have an angle sensor, it makes sense to get rid of the reed switch and reduce your production costs. |
| |
| ▲ | 7 days ago | parent | prev [-] | | [deleted] |
|
|
| ▲ | Reason077 7 days ago | parent | prev | next [-] |
| It can’t be exclusively for Desk View. Desk View only works on Macs with wide-angle cameras, which were introduced in 2024 and 2025 models. But this sensor has been in MacBooks since the 2019 models. |
| |
| ▲ | appellations 7 days ago | parent | next [-] | | Apple has a history of adding sensors, security chips, etc. a few revisions before the feature they support launches. It’s a really good idea because it helps them sort out the supply chain, reliability, drivers, etc. without any customer impact. It decouples the risks of the hardware project from the risks of the software project. If things go particularly well you get to launch the feature on multiple hardware revisions at once because the first deployment of the component worked great, which is a neat trick. | | |
| ▲ | Hamuko 6 days ago | parent [-] | | Yeah, my iPhone 11 Pro came with the ultra-wideband chip in late 2019, and before the AirTags were released in early 2021, I believe the only thing it was used for was ordering AirDrop targets by proximity. It was clearly intended for the AirTags from the beginning, but it took about 1.5 years before it actually mattered. |
| |
| ▲ | wklauss 7 days ago | parent | prev | next [-] | | At Apple Stores, laptop screens have to be opened to exactly 76 degrees. I wonder if they use this sensor and specific software for adjustment. (I'm not implying this is the only reason it's there.) | | |
| ▲ | simonbw 6 days ago | parent | next [-] | | It seems like it would be much quicker and easier to just have a piece of plastic or something cut at a 76-degree angle that they can place on the laptop and fold the screen up to. | | |
| ▲ | SchemaLoad 6 days ago | parent | next [-] | | Could be that the demo OS reports some metric on how often the laptops are set to 76 degrees and how often customers move them. There are probably a whole ton of uses for the sensor, and if it's price-comparable to the old close sensor they used, it would be easy to justify. | |
| ▲ | wklauss 6 days ago | parent | prev | next [-] | | I've heard employees sometimes use the Measure app on their iPhones to adjust in the mornings, but having a sensor in the laptop lid seems like a much easier way to do it, and you don't need to carry anything with you. | |
| ▲ | wickedsight 6 days ago | parent | prev [-] | | It would not, since you don't want to carry a piece of plastic around all day just to set the angle correctly. Most people just use their phones to check the angle, though. |
| |
| ▲ | stevage 6 days ago | parent | prev [-] | | 76 degrees is just an aesthetic choice? | | |
| ▲ | wklauss 6 days ago | parent | next [-] | | I'm assuming so. Apparently it's an angle that "invites" people to use the computers, but I don't think there's anything specific about 76 degrees that makes it better than, say, 73 or 82. As long as you can see the content from an average height, it should work. Most likely they just settled on that angle because it looked good to the team staging the first store; they measured it, it turned out to be 76, and they've kept it the same across stores since then for consistency. | |
| ▲ | scratchyone 6 days ago | parent | next [-] | | I believe the rumor is that 76 degrees is slightly uncomfortable enough to look at that it makes you want to adjust the screen, which in turn makes you more likely to try the device. | |
| ▲ | bnj 6 days ago | parent | prev [-] | | Yep, this makes a lot of sense. Adding on: picking a specific measurement means that all of them can be the same (consistency, as you said), and variation within the same row would look bad from a distance. | |
| |
| ▲ | isomorphic 6 days ago | parent | prev [-] | | https://www.forbes.com/sites/carminegallo/2012/06/14/why-the... |
|
| |
| ▲ | DSingularity 7 days ago | parent | prev | next [-] | | Shows you how good they are at planning: decomposing features into well-scoped hardware and software pieces that can ship earlier and provide some value while enabling richer future features. You have to respect them for this, because this is how they have always operated. | |
| ▲ | 7 days ago | parent | prev [-] | | [deleted] |
|
|
| ▲ | KeplerBoy 6 days ago | parent | prev | next [-] |
Fascinating feature! Is it known how they do it? Is it just an image transformation, or a full-blown AI model using Gaussian splats or something along those lines?
|
| ▲ | anal_reactor 7 days ago | parent | prev | next [-] |
| You could calculate the angle from the camera view as long as at least some piece of the MacBook is in view. |
| |
| ▲ | antennafirepla 7 days ago | parent | next [-] | | You could, for orders of magnitude more compute than reading a magnetic encoder (my assumption as to how they estimate it). | | |
| ▲ | estimator7292 7 days ago | parent | next [-] | | Sure, but not more than what you're already spending on transforming the image. And it's not like these devices are exactly lacking in horsepower. | | |
| ▲ | 3eb7988a1663 7 days ago | parent [-] | | This is trivially broken by people who affix some type of cover over the camera. I do this on the off chance some errant application thinks it deserves to take pictures of my environment. | | |
| ▲ | yonatan8070 7 days ago | parent [-] | | If someone covers the camera, the feature isn't relevant since it requires the camera to see your desk | | |
| ▲ | kazinator 7 days ago | parent | next [-] | | Isn't the Desk View produced from the iPhone camera capture, not from the MacBook's camera? | | |
| ▲ | empressplay 7 days ago | parent [-] | | If you have a new MacBook, the built-in camera does it. | | |
| ▲ | kazinator 6 days ago | parent [-] | | I'm typing on a 2024 MacBook Pro. Is that sufficiently new? I don't see how it would work, practically. The only camera is the user-facing one. If the screen were tilted down toward the desk, I'd have to kneel down to see it. |
|
| |
| ▲ | 7 days ago | parent | prev [-] | | [deleted] |
|
|
| |
| ▲ | Cthulhu_ 7 days ago | parent | prev | next [-] | | But compute is cheaper for the manufacturer than adding a sensor (parts & labor, and it adds up over millions). Someone must've done the math. | |
| ▲ | 7 days ago | parent | prev [-] | | [deleted] |
| |
| ▲ | gcanyon 6 days ago | parent | prev | next [-] | | The Mac camera light is wired inline. If the camera is on, so is the light. Since we're not seeing the camera light flashing on periodically, this isn't how it's being done. | | |
| ▲ | danhau 6 days ago | parent [-] | | The MacBook tally light isn't necessarily wired to the camera. It very well could be independently software controlled. At least it was not too long ago. IIRC there was an article about this, posted here on HN. Macs used to have (still have?) a feature where you could declare one as lost/stolen and remotely take a photo with the camera. I believe the light didn't glow for that. |
| |
| ▲ | sannysanoff 7 days ago | parent | prev | next [-] | | shameless plug: https://sannysanoff.github.io/whiteboard/ not only for mac users. | |
| ▲ | 7 days ago | parent | prev | next [-] | | [deleted] | |
| ▲ | ivanjermakov 7 days ago | parent | prev | next [-] | | Relevant XKCD: https://xkcd.com/1425/ | | |
| ▲ | junon 7 days ago | parent [-] | | This was correct a number of years ago. Feels a little strange we can just do an API call for bird recognition now. | | |
| ▲ | djhn 7 days ago | parent | next [-] | | But is there actually an API for that? Last I checked, even distinguishing cats from dogs was still unreliable with the big providers' video intelligence APIs. | | |
| ▲ | junon 6 days ago | parent | next [-] | | Just to see if a bird is in the picture (as the comic states), ChatGPT et al. can probably do a sufficient job. Not condoning that people make this app; just thinking about how fast things have moved in a few short years. | | |
| ▲ | JustExAWS 5 days ago | parent [-] | | For a POC, I’ve done animal recognition in a picture with Anthropic and the various Amazon Nova models. It’s around 10 lines of code. |
| |
| ▲ | SAI_Peregrinus 6 days ago | parent | prev | next [-] | | BirdNET from the Cornell Lab of Ornithology provides that API. | | |
| ▲ | filoleg 6 days ago | parent [-] | | Unless I am missing something massive, BirdNET[0] is for identifying birds by sound, not by images. Merlin[1] (also from Cornell Lab of Ornithology), on the other hand, has both image and sound ID. I haven't used either, so I cannot compare the quality of results from Merlin vs. BirdNET for sound ID, but afaik only Merlin has image ID. 0. https://birdnet.cornell.edu/ 1. https://merlin.allaboutbirds.org/ |
| |
| ▲ | reaperducer 6 days ago | parent | prev | next [-] | | https://merlin.allaboutbirds.org/ | |
| ▲ | MaxikCZ 7 days ago | parent | prev [-] | | These days you don't need an API; you can run the stack on a Tamagotchi. |
| |
| ▲ | andreareina 6 days ago | parent | prev [-] | | Flickr did it in 2014, the same year as the comic. Unfortunately, the service is down and they didn't include a screenshot of it working. https://code.flickr.net/2014/10/20/introducing-flickr-park-o...
|
| |
| ▲ | vaenaes 7 days ago | parent | prev | next [-] | | [dead] | |
| ▲ | Biganon 7 days ago | parent | prev | next [-] | | [flagged] | |
| ▲ | lazide 7 days ago | parent | prev | next [-] | | Ho boy, good luck convincing people it wasn't watching them wank! | |
| ▲ | inetknght 6 days ago | parent | prev [-] | | That sounds like an excuse to enable turning on the camera without turning on the light for it just because no user-software is using it. No thanks. Plenty of users put stickers on their cameras. One simple user trick would break your whole workflow. | | |
| ▲ | gcanyon 6 days ago | parent [-] | | The Mac camera light is wired inline so as to make this impossible. The only way for the camera to be on and the light not is if the light itself is broken. |
|
|
|
| ▲ | 7 days ago | parent | prev | next [-] |
| [deleted] |
|
| ▲ | a1o 6 days ago | parent | prev [-] |
| How does this work? Does it have two cameras? |