bhouston 2 days ago

Nice. The iPads generally measure ~8% slower than the MacBooks, I guess for cooling reasons. So we should see a single-core Geekbench score of approximately 4400 for the MacBook series. This is nice.

Single thread MacBook progression on Geekbench:

M1: 2350

M2: 2600

M3: 3100

M4: 3850

M5: 4400 (estimated)

https://browser.geekbench.com/mac-benchmarks

  • eek2121 2 days ago

    Keep in mind that a big part of the huge jump in recent chips was that GB6 added support for SME, and to my knowledge, no app uses SME as of yet. GB5 is a better benchmark for all these chips for this reason.

    The actual IPC and perf/clock gains of these chips, excluding SME-specific acceleration, are MUCH smaller.

    • gilgoomesh a day ago

      I'm not sure what you're talking about. Any app compiled using LLVM 17 (2023) can use SME directly and any app that uses Apple's Accelerate framework automatically takes advantage of SME since iOS 18/macOS 15 last year.
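
      To make "automatically" concrete, here's a minimal Swift sketch (assuming the Accelerate/vDSP overlay; the point is that any AMX/SME dispatch happens inside the framework, not in app code):

        import Accelerate

        // Sum a large vector through vDSP. On Apple silicon, Accelerate
        // is free to route calls like this to the matrix/vector hardware
        // (AMX, and SME on newer cores) without the app changing at all.
        let xs = [Float](repeating: 1.5, count: 1_000_000)
        let total = vDSP.sum(xs)
        print(total)  // 1500000.0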

      • saagarjha a day ago

        Most apps do not use Apple’s Accelerate framework.

        • svantana a day ago

          What do you base this on? I use it in all my products and I don't see why any performance-sensitive dev outfit wouldn't at least consider using it.

          • saagarjha a day ago

            Most dev outfits are not performance-sensitive

            • mpweiher a day ago

              Then the whole benchmark discussion becomes moot, doesn't it?

              • saagarjha a day ago

                Benchmarking a processor for real-world usage is definitely something you can do.

                • danielheath a day ago

                  Benchmarking a processor for "app written by someone who disregards performance" is something you can do, but it's a bit of a pointless exercise; no processor will ever keep up with developers ability to write slow code.

                  • tsimionescu a day ago

                    An important question people care about is "will my day to day computing improve with the new processor?".

        • gilgoomesh a day ago

          Of course. And these are CPU vector instructions, so the saying "The wider the SIMD, the narrower the audience" applies.

          But ultimately with a benchmark like Geekbench, you're trusting them to pick a weighting. Geekbench 6 is not any different in that regard to Geekbench 5 – it's not going to directly reflect every app you run.

          I was really just pointing out that the idea that "no" apps use SME is wrong and therefore including it does not invalidate anything – it very well could speed up your apps, depending on what you use.

    • bhouston 2 days ago

      I've benchmarked these myself on things like my project's build time on M1, M2 and M3 and I did see similar gains. So I disagree from experience.

    • aurareturn 2 days ago

      SME is just the AMX coprocessor that’s been in Apple chips since 2019. SME made it easier to target the AMX. But it’s been in use and available to developers since 2019.

      • sgerenser a day ago

        Similar, but not the same. SME is much more powerful than AMX on the pre-M4 cores, and software can target it directly instead of using Apple's frameworks. Which means that software is more likely to actually use it (eventually), even if hardly anything does now.

      • wmf 2 days ago

        The point stands that virtually no apps used AMX (either directly or through a framework).

        • GeekyBear 2 days ago

          > The point stands that virtually no apps used AMX (either directly or through a framework).

          AMX has been present in every M series chip and the A series chips starting with the A13. If you are comparing M series chip scores in Geekbench 6 they are all using it, not just the latest ones.

          Any app using Apple's Accelerate framework will take advantage of it.

          • sgerenser a day ago

            This isn’t true, I don’t believe GeekBench ever made use of AMX. They do use SME on any Arm-based platform that has it, which up until extremely recently has only been Apple.

        • tcdent a day ago

          Third-party apps sure, but you can guarantee Apple is taking advantage of it fully with any of their work.

          • astafrig a day ago

            Including in authoring all their other frameworks, which are used by many, many apps.

  • Choco31415 2 days ago

    Here’s the multi core Geekbench progression:

    M1: 8350

    M2: 9700

    M3: 11650

    M4: 14600

    M5: 16650 (estimated)

    This is assuming an 8% uplift as mentioned. Also nice.

    • AnthonyMouse 2 days ago

      I wish we could get something other than Geekbench for these things, since Geekbench seems to be trash. For example, it has the Ryzen 7 7700X with a higher multi-core score than the Epyc 9534 even though they're both Zen4 and the latter has 8 times as many cores and is significantly faster on threaded workloads in real life.

      • wtallis a day ago

        There's real value in having a multi-threaded benchmark that doesn't ignore Amdahl's Law and pretend that everything is embarrassingly parallel.
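
        To make that concrete, a quick Swift sketch of Amdahl's Law (the 95% figure is made up for illustration, not Geekbench's actual workload mix):

          // Amdahl's Law: with parallel fraction p and n cores,
          // speedup = 1 / ((1 - p) + p / n)
          func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
              1.0 / ((1.0 - p) + p / n)
          }

          print(amdahlSpeedup(parallelFraction: 0.95, cores: 8))   // ~5.9x
          print(amdahlSpeedup(parallelFraction: 0.95, cores: 64))  // ~15.4x

        Even a 95%-parallel workload gains far less than the 8x core-count difference between those two chips would suggest.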

        • AnthonyMouse a day ago

          That's what the single thread score is supposed to be for. The multi-thread score is supposed to tell you how the thing performs on the many real workloads that are embarrassingly parallel.

          Suppose I'm trying to decide whether to buy a 32-core system with a lower base clock or a 24-core system with a higher base clock. What good is it to tell me that both of them are the same speed as the 8-core system because they have the same boost clock and the "multi-core" benchmark doesn't actually use most of the cores?

          • thyristan a day ago

            The only valid benchmark for that is to use the application you intend to run as the benchmark. Even embarrassingly parallel problems can have different characteristics depending on their use of memory and caches and the thermal characteristics of the CPU. Something that uses only L1 cache and registers will probably scale almost linearly in the number of cores, except for thermal influences. Something that uses L2, L3 caches or even main memory will be sublinear.

            • AnthonyMouse a day ago

              You're essentially just arguing that all general-purpose benchmarks are worthless because your application could be different.

              Suppose I run many different kinds of applications and am just looking for an overall score to provide a general idea of how two machines compare with one another. That's supposed to be the purpose of these benchmarks, isn't it? But this one seems to be unusually useless at distinguishing between various machines with more than a small number of cores.

              Your analysis is also incorrect for many of these systems. Each core may have its own L2 cache and each core complex may have its own L3, so systems with more core complexes don't inherently have more contention for caches because they also have more caches. Likewise, systems with more cores often also have more memory bandwidth, so the amount of bandwidth per core isn't inherently less than it is in systems with fewer cores, and in some cases it's actually more, e.g. a HEDT processor may have twice as many cores but four times as many memory channels.

              • thyristan a day ago

                General-purpose benchmarks aren't worthless. They can be used to predict, in very broad strokes, what application performance might be. Especially if you don't really know what the applications would be, or if it is too tedious to use real application benchmarks.

                But in your example, deciding between 24 cores with somewhat higher frequency or 32 cores with somewhat lower frequency based on some general-purpose benchmark is essentially pointless. The difference will be small enough that only a real application benchmark can tell you what you need to know. A general-purpose benchmark will be no better than a coin toss, because the exact workings of the benchmark, the weightings of its components into a score, and the exact hardware you are running on will have interactions that will determine the decision to a far greater extent. You are right that there could be shared or separate caches, shared or separate memory channels. The benchmark might exercise those, or it might not. It might heat certain parts of the die more than others. It might just be the epitome of embarrassingly parallel benchmarks, BogoMIPS, which is a loop executing NOPs. The predictive value of the general-purpose benchmark is nil in those cases. The variability from the benchmark maker's choices will always necessarily introduce a bias and therefore a measurement uncertainty. And what you are trying to measure is usually smaller than that uncertainty. Therefore: no better than a coin toss.

                • AnthonyMouse 16 hours ago

                  You're just back to arguing that general purpose benchmarks are worthless again. Yes, they're not as applicable to the performance of a specific application as testing that application in particular, but you don't always have a specific application in mind. Many systems run a wide variety of different applications.

                  And a benchmark can then provide a reasonable cross-section of different applications. Or it can yield scores that don't reflect real-world performance differences, implying that it's poorly designed.

      • Aurornis a day ago

        The trick with GeekBench is to scroll down and look at the specific sub-benchmarks that are most relevant to you.

        • AnthonyMouse a day ago

          I attempted to do this and discovered an irregularity.

          Many of the systems claiming to have that CPU were actually VMs assigned varying numbers of cores, fewer than the chip actually has. Moreover, VMs can list any CPU they want as long as the underlying hardware supports the same set of instructions, so unknown numbers of them could have been running on different physical hardware, including on systems that e.g. use Zen4c instead of Zen4, since they provide the same set of instructions.

          If they're just taking all of those submissions and averaging them to get a combined score it's no wonder the results are nonsense. And VMs can claim to be non-server CPUs too:

          https://browser.geekbench.com/v6/cpu/search?utf8=%E2%9C%93&q...

          Are they actually averaging these into the results they show everyone?

      • ZiiS a day ago

        Any benchmark useful to cross compare single user desktop/laptop experience is going to be useless in the datacentre; and vice versa.

      • a-french-anon a day ago

        Yeah, a simple SPECint or builtin Python benchmarks would be way more interesting than a proprietary "benchmark" with mystery tasks.

      • renewiltord a day ago

        Just use xmrig. Smashes all cores.

    • nodesocket 2 days ago

      For reference, I have an M4 Pro Mac mini, top-spec model with 14 cores, and score:

        single: 3960
        multi: 22521
      • bhouston 2 days ago

        I think he is showing the base CPU comparison for the Mac mini/MacBooks. There are so many M-series multicore variants it is hard to mention them all.

    • bhouston 2 days ago

      Will the base core count and mix between perf and efficiency cores remain the same? That has led to different scaling factors for multicore performance than for the single-core metrics.

      • zamadatix 2 days ago

        Possibly, at least compared to the previous M4 generation. For the lowest tier M models to this point:

          M1 (any):  4P + 4E
          M2 (any):  4P + 4E
          M3 (any):  4P + 4E
          M4 (iPad): 3P + 6E
          M4 (Mac):  4P + 6E
          M5 (iPad): 3P + 6E (claimed)
          M5 (Mac):  Unknown
        
        It's worth noting there are often higher tier models that still don't earn the "Pro" moniker. E.g. there is a 4P + 8E variant of the iMac which is still marketed as just having a normal M4.

        • AtlasBarfed a day ago

          Are these cores getting way more complex? Because there should be room for 2x - 3x as many cores with die shrinks at this point.

          • zamadatix 20 hours ago

            The die shrinks are less than the marketing numbers would make you believe, but the cores are getting significantly more complex. I think E cores had a 50% cache increase this generation, as an example.

            The above summary also excludes the GPU, which seems to have gotten the most attention this generation (~+30%, even more in AI workloads).

          • astrange a day ago

            If you get more space, you need a really good reason not to use that space on more cache.

            Also, the size numbers are lies and aren't the actual size of anything.

  • porphyra 2 days ago

    Nice. Lots of people still claim that M1 is super duper fast but now we're at almost twice the performance!

    • mpalmer 2 days ago

      M1 is still fast, the speed of its successors does not change that.

      • steve_adams_86 2 days ago

        I use one for software development and it's great. Sometimes Rust builds are slow, and I'd love to force them to be faster with hardware (optimizing build time would be a huge undertaking with not-so-great returns), but otherwise I'm totally content. I also have an M2 Max with 32GB of RAM that still feels like magic. I've never had computers that felt so fast for so long.

        • necovek a day ago

          I have an i7-4790K that still scores almost 1500 on the single-core benchmark: that thing felt fast for a decade+.

          • biinjo a day ago

            But it burns a hole in your lap while doing so probably.

            • pmontra a day ago

              Most laptops sit on a desk and never see a lap, even Macs.

              Benchmarking how long people could sit with a laptop on their lap while running Geekbench could be an interesting metric though.

              • Xenoamorphous a day ago

                I can't even remember PCs now (been 10+ years on Macs) but heat is still an issue (especially in summer in hot climates) if the thing is going to throttle.

                • necovek 5 hours ago

                  Heat is an issue with Macs too: if it wasn't, you'd have Air chassis with the performance of M4 Max/Ultra.

                  Yes, they've done some nice things to get the performance and energy efficiency up, but it's not like they've got some magic bullet either. From what I've seen in reviews, Intel is not so far off with things like the Ultra 7 258V. If they caught up to TSMC on the process node, they would probably match Apple too.

                • pmontra a day ago

                  I've got a laptop from around the time you switched to a Mac. It's warm in winter, which is nice, but not so warm as to be a problem in summer. My workloads are mild, Django and Rails. Even the test suites are not CPU bound. Linux, not Windows.

            • necovek a day ago

              It's an old desktop built in 2014 or 2015 that I never used on my lap.

              This was a reply to "never having had a computer that felt fast for so long".

              For some tasks, this CPU with the GTX 970 still feels faster than MacBook M2 or recent Ryzen laptop APU.

            • 71bw a day ago

              And yet, it allows the user to experience true freedom unlike Apple's offerings.

              • biinjo a day ago

                Care to elaborate on this? I get that iOS is a closed system, but I've never felt limited by Apple in what I can do with my MacBook.

                • 71bw a day ago

                  There is no decent option on an alternative operating system. I do not like macOS and its many quirks, especially its extremely gatekept nature, including the system being SURE that what IT wants is the best for me; I understand that this might be an approach that some people prefer, but in my case it's the equivalent of showing a bull a red cloth.

                  I will say this - and most will not like this - that I'd go out and buy a M* MacBook if they still kept Boot Camp around and let me install Windows 11 ARM on it. I've heard Linux is pretty OK nowadays, but I have some... ideological differences with the staff behind Asahi and it is still a wonky hack that Apple can put their foot down on any day.

                  • thenthenthen a day ago

                    Why not virtualise (?) with Parallels? Runs like a beast (M1 mba).

                    • 71bw a day ago

                      Because I'd prefer to run bare metal and, if I recall correctly, macOS still hogs up 50+ GB on a clean install - and there is no GPU acceleration(? - might be wrong here).

                • steve_adams_86 a day ago

                  I guess the hardware is extremely locked down. That part is a drag. Application sharing limitations (needing to publish to the App Store, more or less) still feel wrong after all these years. There's more, but those are the ones that bother me with any frequency.

                  • biinjo a day ago

                    Need to publish to the app store? I have precisely zero app store applications installed. All just downloaded DMGs from the vendor’s website.

        • asa400 a day ago

          Same here. I have an M1 Max with 64GB and the only time I notice a slowdown is doing Rust release builds with `lto = true` and `codegen-units = 1`, which makes complete sense. Otherwise there is _plenty_ of multicore performance for debug builds, webdev, web browsing, etc., often all at once.
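
          (For reference, those two settings live in the release profile of Cargo.toml; a minimal sketch of the relevant section:)

            [profile.release]
            lto = true          # whole-program LTO: much slower builds, faster binaries
            codegen-units = 1   # single codegen unit: better optimization, no parallel codegen

          Both trade compile time for runtime performance, which is why they hit even fast machines hard.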

          • necovek a day ago

            Fill out your SSD to 90% or more (and keep it there), and you'll understand how quickly things can go south.

            • moi2388 a day ago

              Why would I ever do that?

              • necovek 7 hours ago

                You paid for that capacity to be able to use it?

                • moi2388 6 hours ago

                  I pay so that I only fill up to 50% usually

        • pylotlight 2 days ago

          I dunno, working with an M1 daily I struggle with resource contention and slow py/js builds. I'd love something faster when work provides me with an updated device.

          • fouc a day ago

            you could try moving your dev environment & Docker into a Linux VM.. could do something similar to this: https://medium.com/carvago-development/my-docker-on-macos-pa...

            I mention this because I'm guessing you might be using Docker Desktop, which is kinda slow.

            • noname120 a day ago

              Or maybe just switch to OrbStack: https://orbstack.dev/

              • hombre_fatal a day ago

                Won’t change how slow it is to run docker containers in VMs on macOS.

                • fouc a day ago

                  Docker definitely runs faster inside Linux running in a VM on macOS. Funny how that works given the overhead of running a VM, but it seems running on Linux & ext4 filesystem interfaces gives it quite a performance boost.

                • noname120 a day ago

                  OrbStack’s docker and VMs are lightning fast.

          • sudoshred a day ago

            My M1 Pro with 16GB gets throttled (and manually restarted) every time VS Code hits 100GB RAM usage when running Jest unit tests. Don't know who to blame, but 99% sure the problem is my source code and not the machine.

            • raddan a day ago

              You’re chewing through 100GB of RAM on unit tests? What is filling your RAM? Are you sure that you don’t have a memory leak somewhere?

              • sudoshred 9 hours ago

                I’m certain it’s a memory leak, just not sure solving the problem is worth my time.

            • gigatexal a day ago

              Chrome eats all my resources too, and I have 128GB of RAM. Some pages like ClickUp and the Google Cloud console (BigQuery) eat something like 3-5GB.

            • kjkjadksj a day ago

              100GB RAM usage? Why not build a PC?

              • necovek a day ago

                Considering he's got 16GB of RAM, this is virtual memory, and most programs don't need to keep it all loaded at all times (if they do, you'll notice the constant paging).

                • noname120 a day ago

                  > My M1 Pro with 16GB gets throttled (and manually restarted) every time

          • dontdoxxme 2 days ago

            Sounds like you need to spend some time optimising your build. Faster hardware just makes developers lazy. I'm still on an M1 and it's fine, although I do have 32GB.

            • mrheosuper a day ago

              Could you clarify more? AFAIK there is no M1 machine with 32GB of RAM.

              • flkenosad a day ago

                M1 Pro

                • mrheosuper a day ago

                  Well, it's quite different from the M1. Please don't mix them up. It's like saying an i3 and an i7 are the same if they are from the same generation.

                  • astrange a day ago

                    It's not that different. It has more cores but they're the same cores.

                    • mrheosuper a day ago

                      And higher power limit, exactly like i3 and i5 and i7. Same core architecture, just different in clock, power and core count.

                    • saagarjha a day ago

                      I think they have the same microarchitecture but are slightly different (in, like, the number of address bits)

            • dangus a day ago

              To be fair, for all you know they’re building something pretty crazy that justifies high resource consumption.

              Faster hardware doesn’t exclusively make developers lazy, it also opens up capability.

              • steve_adams_86 a day ago

                That's the thing. It's not a trivial project. I've already put a lot of time into optimizations and the returns are diminishing now.

                • dangus a day ago

                  In the age of AI it seems wild to blame developers for being “lazy” and needing more resources.

                  Like, if I were buying a new workstation right now, I'd want to be shelling out $2000 so that I could get something like a Ryzen AI Max+ 395 with 128GB of fast RAM for local AI, or an equivalent Mac Studio.

                  That’s definitely not because I’m “lazy,” it’s because I can’t run a decent model on a raspberry pi

          • michelb a day ago

            OSX file I/O is really, really slow with large numbers of small files. Wondering if Apple will ever fix that.

      • rafaelmn a day ago

        Not really - I bought an M4 Air to check how my dev ecosystem (.NET) would run on ARM/Apple silicon, and while it's usable, it's noticeably slowing me down in my day-to-day with Rider. I will be getting the M5 Pro because the performance is a bottleneck for me (even incremental builds take a while). I also regret not getting at least 30GB of RAM (Amazon only had 24GB configurations), because with Docker running I constantly hit the memory pressure thresholds.

        Which is not to say that the Air is a bad device. It's an amazing laptop (especially for the price; I have not seen a single Windows laptop with this build quality even at 2x the price) and the performance is good - if I were doing something like VSCode and node/frontend only, it would be more than enough.

        But people here also oversell its capabilities: if you need anything more CPU/memory intensive, a Pro is a must, and the "Apple needs less RAM because of the fast IO/memory" argument is a myth.

        • saagarjha a day ago

          Have you checked what is slowing it down?

          • rafaelmn 21 hours ago

            Just everything siphoning off RAM: standard office/desktop stuff like Slack, Chrome, Mail, Calendar, Messages, and a few IMs will easily eat up over 12GB; then add in VSCode, Rider, and Docker (which I cap at 2GB of RAM), and I am swapping gigabytes when I have multiple tools running.

            But even when I kill all processes and just run a build, you can see the lack of cores slow the build down enough that it is noticeable. Investing in a 48GB RAM/Pro version will definitely be worth it for the improved experience; I can get by in the meantime by working more on my desktop workstation.

          • tonyedgecombe a day ago

            I wonder if it’s Docker as it runs in a VM on macOS.

            • rafaelmn 21 hours ago

              You can control how much RAM gets allocated to the Docker VM, and I keep that at 2GB, which is not that much. I am not running stuff inside Docker locally, just using it to boot up stuff like pg/rmq/redis.

    • matwood a day ago

      I bought an M1 Max with 64GB of RAM on the gamble that it would last a while, because future M series would be evolutionary, not revolutionary. It seems like it's paid off, since there's never a time it feels slow. I may end up upgrading to the M5 just to go down in screen size, because I travel more than I did when I bought the 16" one.

      • yesnomaybe 4 hours ago

        Have an M1 Pro 32GB which recently started feeling slower. VSCode with multiple tabs is a problem. Generally the UI feels less snappy.

        I've switched now to desktop Linux, using an 8C/16T AMD Ryzen 7 9700X with 64GB. It's like night and day, but it is software related. Apple just slows everything down with their animations and UI patterns. Probably to nudge people to acquire faster, newer hardware.

        The change to Linux is a change in lifestyle, but it comes with a lot of freedom and options.

      • raydev 15 hours ago

        Same. My M1 Max only shows its age when building a few massive C++ projects; the M4 cuts compile times to a quarter, so I feel like I'm missing out there, but not enough to put down another $4k yet.

        Compared to every Intel MBP I went through, where they would show their age after about 2-3 years, and every action/compile required more and more fans and throttling, the M1 is still a magical processor.

      • sgt101 20 hours ago

        I had a 16" M1 Pro until July, when work had an M3 with 2GB more RAM going free and I thought it would be smart to take it. In the past, when I had a three-plus-year-old machine and upgraded, everything was noticeably better... but this time I don't notice any difference at all in almost any application.

        The only place I feel it is when I am running a local llm - I do get appreciably more tokens per second.

    • para_parolu a day ago

      I use an M1 Max and an M4 Max every day. I don't see a difference in speed at all. Regular tasks are equally fast. The Rust compiler is equally slow.

    • gpm a day ago

      I claim that the M1 (MacBook Air) is fast. I also claim it's about half the speed of my similarly priced desktop of the same vintage for the slow tasks that I care about.

      So I guess we've caught up with the desktop now.

      Actually, I assume we would have caught up a while ago if I had used the beefy multi-core Mx Ultra variants they released; really it's just the base model that has caught up. On the other hand, I could have spent four times as much for twice as many cores on my desktop as well.

      • Panzer04 a day ago

        You can buy laptops with essentially identical CPU performance to desktops when plugged in nowadays, so long as you aren't thinking of Threadripper.

        On the move laptops will always be a bit slow because all the tricks to save idle usage don't help much when you're actually putting them to work.

    • tedk-42 2 days ago

      A car that travels 0-60 MPH in 2 seconds is fast

      A car that does this in 4 seconds is still fast (though twice as slow)

      • onionisafruit 2 days ago

        If they both top out at 60 they are equally fast. One just has better acceleration than the other.

        • sib a day ago

          Yes - typically the (US English, at least) jargon is "quick" for acceleration and "fast" for top speed...

          • olyjohn 19 hours ago

            Nah not really. When you watch drag racing, they're testing acceleration. One car is always faster. Nobody says one is quicker than the other. Quick (when referring to straight line speed) is reserved for cars like the Miata, which has decent acceleration, but certainly can't accelerate like a muscle car. Nobody really compares top speed all that much because it's damn near impossible to hit top speed in a lot of cars, even on a race track. You will find slower cars comparing top speed though. Like an MG Midget, or an early Honda Civic might be able to hit 100mph, but that's an easily attainable speed. Fast cars are just faster than quick cars.

        • dzhiurgis a day ago

          Top speed hardly matters for cars. Acceleration is where the fun is. Not sure this applies to CPUs at all.

          • pmontra a day ago

            Cornering speed is where real fun is. And danger.

          • hbogert a day ago

            Acceleration is actually a thing. The CPU needs to ramp up cycles fast if it wants to feel snappy. It needs to wind down as soon as no significant workload is there anymore. All needed for good efficiency

          • jychang a day ago

            Any CPU can run at 100GHz for 1 cycle. The trick is getting it to run at 100GHz forever.

            • saagarjha a day ago

              I think the speed of light would like to have a word with you.

      • mrheosuper a day ago

        The first car is much more fun to ride

        • NetMageSCW a day ago

          Only if all you care about is a straight line.

          • mrheosuper a day ago

            I assume everything else is the same, or else it will be apples and oranges.

    • jackothy 2 days ago

      What is that, 6 years for less than a doubling? Nothing against the Apple chips themselves, but gone are the days of Moore...

      • danhon a day ago

        Moore was just about the transistor count doubling. Not performance.

        • jackothy 18 hours ago

          Performance is an excellent proxy for transistor count. Somebody else replied with the actual transistor counts, which had basically equal scaling.

      • epicureanideal a day ago

        Not necessarily. If single core speed doubled but we also doubled or tripled the number of cores, doesn’t that count as keeping up with Moore?

        • jl6 a day ago

          M1 has 16bn transistors, M4 has 28bn. Increasing the core count is useful for some applications (particularly GPU cores), but there are still many critical workloads that are gated by single-threaded performance.
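
          Putting those side by side (transistor counts from above, single-core scores from upthread), a quick Swift check:

            let transistorRatio = 28.0 / 16.0    // M1 -> M4: 1.75x transistors
            let perfRatio = 3850.0 / 2350.0      // M1 -> M4: ~1.64x single-core
            print(transistorRatio, perfRatio)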

          • CyberDildonics 21 hours ago

            That's not what was being talked about.

            Moore's law was never about single threaded performance, it was about transistor count and transistor cost, but people misunderstood it when single threaded performance was increasing exponentially.

            • jl6 19 hours ago

              GP mentions cores and the article is about benchmarks. The relationship between transistor count, core count, and performance, is exactly on topic.

    • dzhiurgis a day ago

      I feel no need to upgrade, especially now that you can have a yearly AppleCare subscription.

    • j45 2 days ago

      M1 is still fast for a lot, more so when it's cooled.

  • TrainedMonkey 2 days ago

    I think we will land around 4300. On paper, N3P offers 5-10% more transistors and 5-10% more efficiency. Naively, that puts the perf lift somewhere between 10.25% (1.05 * 1.05) and 21% (1.10 * 1.10); 4300 sits within that range, toward the conservative end. Gains in iPhone and iPad perf are heavily supported by new cooling tech. Compared to iPhones and iPads, MacBook Pros are much less thermally constrained, and the M5 MacBooks are keeping the M4 design.
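
    Spelling out that arithmetic in a quick Swift sketch (M4 single-core score of ~3850 taken from upthread):

      let m4 = 3850.0
      let low = m4 * 1.05 * 1.05    // ~4245
      let high = m4 * 1.10 * 1.10   // ~4659
      print(low, high)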

    TL;DR: I expect a smaller M4-to-M5 pop for the MBP than for the iPad, because the latter benefits from new cooling tech.

  • Reason077 2 days ago

    Here's the comparison between the M4 iPad Pro and the M5:

    https://browser.geekbench.com/v6/cpu/compare/14173685?baseli...

    About 10% faster for single-core and 16% faster for multi-core compared to the M4. The iPad M5 has the same number of cores and the same clock speed as the M4, but has increased the RAM from 8GB to 12GB.

    • sgerenser a day ago

      M4 iPads with the top two storage tiers (1 or 2 TB) already come with 12 GB. Maybe they’re changing to all having 12GB though, who knows.

      • mrheosuper a day ago

        All the Pro iPhones have 12GB of RAM now, so it's safe to assume all Pro iPads will have the same amount of RAM.

      • AtlasBarfed a day ago

        Are Macs still committing highway robbery on storage and RAM costs?

  • dwd a day ago

    Between personal and home, I've had:

    M1 (16GB)

    M1 Pro (16GB)

    M2 Pro (16GB)

    M3 Pro (32GB)

    M4 Air (24GB)

    Currently I switch between the M2 Pro and the M4 Air, and the Air is noticeably snappier in everyday tasks. The 17' M3 Pro is the faster machine, but I prefer not to lug it around all day, so it gets left home and occasionally used by the wife.

    • ulfw a day ago

      What is a 17' M3 Pro?

      • bigiain a day ago

        A 17 foot tall MacBook, which was Steve's original vision sketched on a napkin.

        Not the 17 inch tall one they built and put on stage for the MacWorld keynote speech - that was in danger of being trod upon by a dwarf.

        • bigyabai 12 hours ago

          Was a great speech nonetheless. Shame about what happened to the drummer though. Best leave it unsolved, really.

      • filoleg a day ago

        Almost certain it is just a 16in M3 Pro, but mistyped.

        • ulfw 8 hours ago

          that's some serious mistyping from 17 feet to 16 inches

  • gigatexal a day ago

    Amazing. Thanks for the comparison to the older gen. Here's hoping the chips they put in the Airs can get more than 32, maybe even 64GB of RAM.

stego-tech 2 days ago

Damn impressive if true, but I’ll be honest, I still don’t feel the need to replace my M1 iPad Pro or my M1 Pro Macbook Pro. They’re both more than amazing for my use cases - though if Apple suddenly took gaming seriously, like extending Rosetta to act as a translation layer for Windows games a la Proton, I’d gladly throw down for an M5 Ultra when it’s released.

  • ascagnel_ 2 days ago

    > though if Apple suddenly took gaming seriously, like extending Rosetta to act as a translation layer for Windows games a la Proton, I’d gladly throw down for an M5 Ultra when it’s released

    No joke, if I could run my Steam library on my phone, I'd probably buy a new phone every year (and might need to, given what the thermals and rapid charge/discharge cycles would do to battery longevity). But Apple's current strategy is to provide a tool, then let developers do the work themselves; compare to Valve's efforts (and occasionally stepping on rakes when games update themselves).

    • jeroenhd a day ago

      There have been credible rumors that Valve is experimenting with an ARM-based SteamOS for the next Steam Deck which would bring quite a lot of games to mobile.

      You can already do this with tools like Winlator, of course, but Valve's performance patches would probably make the whole process a lot easier to get working.

      Any such feature would come to Apple hardware last because of Apple's arbitrary software limitations (maybe it'll work in the EU?), of course, but once Proton goes full ARM, it's only a matter of time.

    • hahayup 2 days ago

      I completely agree. I've worked on the Whisky project and am fairly familiar with WINE, Proton, and CrossOver. Proton uses the underlying technology of WINE, which is largely supported by CodeWeavers, who mainly create CrossOver, essentially Proton for macOS. CodeWeavers also supported the creation of GPTK, which uses WINE as well. CrossOver, considering it's live-translating Windows x86 calls to macOS x86 calls and then piping that through Rosetta, works INCREDIBLY well on my M1 Pro. Elden Ring easily performed better there than on my Steam Deck.

      As far as I can tell, the main issues with putting CrossOver on iOS are a lack of API support and an inability for iOS software to start new processes. AltStore and emulators on iOS are exciting, and with iPadOS and macOS becoming increasingly similar, I hope to see someone give WINE on iOS a shot.

      I would love to see a world where I can play my Steam library on my iPad or my iPhone, considering the wild amount of performance they can output, but the limitations of iOS make it very difficult or likely impossible.

      • whatever1 2 days ago

        Apple does not take a 30% cut from this, hence it will never happen.

        • NetMageSCW a day ago

          Apple certainly allows a lot of free apps without in-app purchases on the App Store.

          They don't care that much about the 30% (which was the best deal ever for a phone app store when it came out) except insofar as the App Store sells hardware.

          • planb a day ago

            This is simply not true anymore. I really believe that when the App Store started, they did not see it as a profit center, but in the last 10 years the bean counters took over.

    • girvo a day ago

      So this doesn't help for iOS, but through the power of WINE and other cool tools, you actually can: https://gamehub.xiaoji.com/

      These are especially great on the various Android-based gaming handhelds that are out now with the Snapdragon 8 Gen 3 and similar SoCs in them, but it works on a phone too.

    • jackothy 2 days ago

      The problem is they don't "let the developers do the work themselves"

      If only the platform were open enough that developers had real access, Apple might get away with, like you say, not providing first-party support for gaming.

      • ascagnel_ 2 days ago

        I mean it more as a counterpoint to what Valve is doing: games are automatically opted-in to working with Proton and pass through a Valve-operated certification program. Because developers aren't part of that loop, you see occasional cases where games issue updates that break compatibility, with users blaming developers based on an outdated, Valve-provided certification.

    • NetMageSCW a day ago

      Apple will replace your phone battery relatively inexpensively.

  • GeekyBear 2 days ago

    > if Apple suddenly took gaming seriously, like extending Rosetta to act as a translation layer for Windows games a la Proton

    Apple already provides the translation layer to convert from DirectX 11 or 12 to Metal that Wine uses on Macs.

    https://wccftech.com/apple-game-porting-toolkit-2-supports-a...

    Proton does the exact same thing, only it translates DirectX into the Graphics API that Wine on Linux uses.

    The new thing is that the M5's GPU cores picked up a 40% performance boost, as seen in the version that just shipped on the new iPhones.

  • wing-_-nuts 2 days ago

    In all honesty, I don't think the CPU has ever been a huge limitation for me outside of gaming. The biggest bottlenecks for me have always been disk speed and memory. My soon-to-be decade-old XPS 13 gets on well enough, except it only has 8GB of soldered-on RAM. That absolutely is a bottleneck for me.

    • umanwizard a day ago

      I guess you don’t work in C++ or Rust. Compile times are a bitch and they’re completely CPU-bound.

      • wing-_-nuts 21 hours ago

        Bear in mind, my personal laptop rarely compiles massive projects. I think the last large thing I compiled by hand was wine, and it's been a while since I've done that. Most of the coding done on that laptop are small toy applications to test a language feature. It also helps that golang is lightning fast when it comes to compile times.

      • mrheosuper a day ago

        How often do you rebuild your C++/Rust? You might consider adding a caching system.

  • willis936 2 days ago

    I know Asahi is in choppy waters this year, but it is a seriously impressive project in (imo) a good state. I was surprised that I could run a one-liner script and an hour later play a 32-bit Windows x86 3D OpenGL game on an ARM Apple machine with reasonable performance.

    • GeekyBear 2 days ago

      Asahi isn't in choppy waters.

      The new project leadership team made the decision to prioritize getting their existing work upstreamed into the Linux kernel, before working on supporting newer SOCs.

      It's been going well.

      • willis936 2 days ago

        Oh, has Linus clarified Linux's position on Rust patches in the kernel? I hadn't heard.

        • norman784 a day ago

          From their blog [0]

          > With Linux 6.16 now out in the wild, it’s time for yet another progress report! As we mentioned last time, the Asahi and Honeykrisp Mesa drivers have finally found their way upstream. This has resulted in a flurry of GPU-related work, so let’s start there.

          Seems that they are doing pretty well upstreaming their work.

          - [0] https://asahilinux.org/2025/08/progress-report-6-16/

        • nixosbestos a day ago

          Rust has been in the kernel for some time now? With 6.18 set to include significantly more, including arguably the single most important component of Android, binder, entirely rewritten in Rust.

          • willis936 a day ago

            Yes, it has. Marcan left Asahi and Linux after the DMA maintainer rejected their patch on non-technical grounds and labeled Rust as cancer for maintainers. So Linus supports Rust in the kernel, Rust patches come in, some maintainers reject them, there is no guidance on where or how Rust will or will not be accepted. Just do a bunch of free work and roll the gatekeeping dice. Very healthy and not abusive.

            • GeekyBear a day ago

              > there is no guidance on where or how Rust will or will not be accepted

              Guidance was provided...

              > Maintainers who want to be involved in the Rust side can be involved in it, and by being involved with it, they will have some say in what the Rust bindings look like. They basically become the maintainers of the Rust interfaces too.

              But maintainers who are taking the "I don't want to deal with Rust" option also then basically will obviously not have to bother with the Rust bindings - but as a result they also won't have any say on what goes on on the Rust side.

              So when you change the C interfaces, the Rust people will have to deal with the fallout, and will have to fix the Rust bindings. That's kind of the promise here: there's that "wall of protection" around C developers that don't want to deal with Rust issues in the promise that they don't have to deal with Rust.

              But that "wall of protection" basically goes both ways. If you don't want to deal with the Rust code, you get no say on the Rust code.

              Put another way: the "nobody is forced to deal with Rust" does not imply "everybody is allowed to veto any Rust code".

              https://lore.kernel.org/lkml/CAHk-=wgLbz1Bm8QhmJ4dJGSmTuV5w_...

            • lomase a day ago

              They left when Linus wrote that social media shaming won't be tolerated and that the main problem was how they were handling things.

              To get anything done in the Linux mailing list you need iron will.

      • rowanG077 2 days ago

        I agree with that somewhat. But they still lost their main GPU kernel driver developer AND their uber-productive userspace GPU driver developer, with the loss of Alyssa in particular leaving a basically unfillable hole.

        • GeekyBear 2 days ago

          Alyssa's work was already upstreamed in May, and the driver is ready to go when the kernel is ready to accept Rust GPU drivers.

          > We are pleased to announce that our graphics driver userspace API (uAPI) has been merged into the Linux kernel. This major milestone allows us to finally enable OpenGL, OpenCL and Vulkan support for Apple Silicon in upstream Mesa. This is the only time a graphics driver’s uAPI has been merged into the kernel independent of the driver itself, which was kindly allowed by the kernel graphics subsystem (DRM) maintainers to facilitate upstream Mesa enablement while the required Rust abstractions make their way upstream. We are grateful for this one-off exception, made possible with close collaboration with the kernel community.

          https://asahilinux.org/2025/05/progress-report-6-15/

          Alyssa didn't abandon the project, she completed it.

          • rowanG077 2 days ago

            Sure, for M1/M2 it's somewhat correct that it is "completed". M3 and beyond have a lot of GPU differences though (a completely new microcode, AFAIK), so the loss is still traumatic. I also don't think GPU perf is on the level of macOS, though I did not do a recent one-to-one bench.

            • GeekyBear 2 days ago

              Yes, and as I mentioned earlier, the new leadership team has decided to upstream the existing work before moving on to work on supporting newer chips.

              • rowanG077 a day ago

                Yes, you said that. That doesn't mean losing two very important team members, who both have unique skillsets, just doesn't have any impact. You really can't wave that away with "new leadership has decided to upstream the existing work before moving on to supporting newer chips". That Alyssa is no longer involved is definitely a huge loss for Asahi; I'm very surprised you can even attempt to deny this.

                • GeekyBear a day ago

                  > That Alyssa is no longer involved definitely is a huge loss for Asahi

                  Are you under the impression that Linux is about to abandon OpenGL and Vulkan in favor of a new graphics API that only Alyssa could possibly implement?

                  • rowanG077 a day ago

                    No. I am under the impression that there is no replacement on the horizon that could take on the challenge of further optimizing the GPU driver and work on M3/M4/M5 GPU support once upstreaming efforts have settled down.

                    • GeekyBear a day ago

                      Alyssa wasn't doing the GPU driver work.

                      Marcan already provided the tools needed to capture the data being passed back and forth between macOS and the GPU, so you can see exactly what the newer versions of the SOC are doing differently.

                      • rowanG077 a day ago

                        I think you are confused. There are two drivers here: the kernel-space driver mostly developed by Lina, and the userspace driver developed by Alyssa. Both are drivers, and of the two, the userspace driver is an order of magnitude more complex.

                        • GeekyBear a day ago

                          Alyssa's userspace work that I've already mentioned has been completed and upstreamed?

                          > We are pleased to announce that our graphics driver userspace API (uAPI) has been merged into the Linux kernel. This major milestone allows us to finally enable OpenGL, OpenCL and Vulkan support for Apple Silicon in upstream Mesa.

                          • rowanG077 a day ago

                            Sure, for M1/M2 it's somewhat correct that it is "completed". M3 and beyond have a lot of GPU differences though (a completely new microcode, AFAIK), so the loss is still traumatic. I also don't think GPU perf is on the level of macOS, though I did not do a recent one-to-one bench.

  • tibbon a day ago

    Yup. I've got an M1 Pro from work, and while I'm looking forward to hopefully getting an M5 in December, I really don't know if it will be a significant increase in my daily tasks or enjoyment. I do have an M4 Pro Mini that's a bit snappier, but it isn't a night-and-day difference.

  • carlosjobim a day ago

    Why should you need to replace your M1 machine? Quality products tend to stay with us for a longer time. Apple still has billions of people without any of their devices to sell their new machines to.

  • jmkni 2 days ago

    Yeah, they've reverse-cannibalised themselves

    • flkiwi 2 days ago

      I mean, they're still doing pretty well.

  • thenaturalist 2 days ago

    Local LLMs will be one of those things where you'll feel a difference.

    Not much else I can think of.

    M1 is still insane. Apps, OS emulation... it all just chugs along.

    • adastra22 2 days ago

      Just upgraded my M1 Pro to an M4 Max. Roughly an order of magnitude faster local inference, though I haven’t benchmarked it.

      Outside of that though, it’s really hard to tell the difference. M1 is/was a beast and plenty fast enough for daily work.

      • asa400 a day ago

        I have an M1 Max and I'm curious, do you notice any difference in battery life/power usage?

        • adastra22 a day ago

          Only had it two days. Don’t know yet!

    • addaon 2 days ago

      I use my M1 Max MacBook Pro for pretty serious CFD with OpenFOAM. It's astonishing how good it is… but a newer machine would be nearly 2x faster, which matters when single runs can take 1-3 hours.

    • ziofill 2 days ago

      I don't know how often it happens that a 5 year old chip still gets praise, but I guess not very often.

matdehaast 2 days ago

It feels like Intel and AMD are asleep at the wheel with their mobile lineups. I've been looking at non-Apple equivalents with similar performance/power to the M lineup, and they all seem to lag by 20%+.

For $800 the M4 Air just seems like one of the best tech deals around.

  • cosmic_cheese 2 days ago

    The reduced horsepower relative to M-series isn't a problem for me as much as efficiency is. Both Intel and AMD seem to struggle with building a CPU that doesn't guzzle battery without also seriously restricting performance.

    This really sucks. The nice thing about high end (Mx Pro/Max) MBPs is that if you need desktop-like power, it's there, but they can also do a pretty good job pretending to be MacBook Airs and stretch that 100Wh battery far further than is possible with similarly powerful x86 laptops.

    This affects ultraportables too, though. A MacBook Air performs well in bursts and only becomes limited in sustained tasks, but competing laptops don't even do burst very well and still need active cooling to boot.

    On the desktop front I think AMD has been killing it but both companies need to start from scratch for laptops.

    • vient 2 days ago

      > building a CPU that doesn't guzzle battery

      It may be a software problem as well. On Windows I regularly need to find which new app has started to eat battery like crazy. Usually it ends up being something third-party related to hardware, like the Alienware app constantly making WMI requests (high CPU usage in the svchost.exe hosting a WMI provider; disabling the Alienware service helped), the Intel Killer Wi-Fi software doing something when I did not even know it was installed on my PC (disabling all related services helped), Dell apps doing something, MSI apps doing something... you get the idea.

      It seems like a class of problems which you simply can't have on macOS because of the closed ecosystem.

      Without all this stuff my Intel 155H works pretty decently, although I'm sure it is far away from M-series in terms of performance.

      • illusive4080 a day ago

        The Mac ecosystem isn’t as closed as you’re alluding to. You can easily download unsigned binaries and run them. Furthermore, if you’re looking for a battery hog, look no further than Microsoft Defender, Jamf Protect, and Elasticbeat. All 3 of those are forcibly installed on my work laptop and guzzle up CPU and battery.

        • swiftcoder a day ago

          > You can easily download unsigned binaries and run them

          It's definitely becoming less easy over time. First you had to click approve in a dialog box, then you had to right-click -> open -> approve, now you have to attempt (and fail) to run the app -> then go into System Settings -> Security -> Approve.

          I wanted to install a 3rd party kernel extension recently, and I had to reboot into the safety partition, and disable some portion of system integrity protection.

          I don't think we're all that far from MacOS being as locked-down as iOS on the software installation front...

          • olyjohn 19 hours ago

            Yep, they will lock all that down. It's been coming for years. Tech companies have learned to do their anti-consumer work slowly and progressively over time instead of dropping it all at once. The whole frog in boiling water thing...

            Microsoft is working towards this too. They wish so bad that they were Apple.

        • vient a day ago

          > You can easily download unsigned binaries and run them

          Of course, but I assume you don't really need to install third-party apps to control hardware. In my case Alienware and Dell bloat came from me setting up an Alienware monitor. MSI bloat came from setting up MSI GPU. Intel Killer stuff just got automatically installed by Windows Update, it seems.

          > Microsoft Defender

          This one I immediately disable after Windows installation so no problems here :)

          At work we get CrowdStrike Falcon; it seems pretty tame for now. Guess it depends on the IT-controlled scan settings though.

          • illusive4080 a day ago

            Re: Microsoft Defender, I’m actually talking about defender on macOS. It is a multi platform product. I hear infosec is pretty happy with it. Me? It uses 100% CPU even when I’m doing nothing. I’m not happy.

            • eppsilon a day ago

              Try some of the steps on this page [1]. In particular, enabling real-time protection stats and then adding exclusions for the processes causing the most file scans can help.

              1. https://learn.microsoft.com/en-us/defender-endpoint/mac-supp...

              • illusive4080 23 minutes ago

                I’m not in control, I’m just a user, but thanks. I have talked to the owners on occasion and plan to keep bringing it up so they can investigate.

          • kalleboo a day ago

            What's mad is that you would have thought that Microsoft would use the Surface devices to show hardware manufacturers what could be done if you put some effort in, but I've heard so many horror stories from Surface owners about driver issues

      • cosmic_cheese 2 days ago

        Windows doesn't do it any favors, for sure. Running Linux with every tweak under the sun for better battery life still leaves a large gap between x86 laptops and MacBooks, however, and while there's probably some low-hanging optimization to be taken advantage of there, I think the real problem is that x86 CPUs just can't idle as low as M-series can, which is exacerbated by the CPU not being able to finish up its work and reach idle as quickly.

        • vient a day ago

          I wonder if Windows and Linux just can't yet work with heterogeneous CPUs as well as macOS does. Intel chose an interesting direction here, going straight from one to three kinds of cores in one chip. I almost never see the LPE cores being used on Windows, and on Linux you have obscure software like Intel LPMD, which I tried but was not able to notice any battery life improvement from.

          • cosmic_cheese a day ago

            I'm a bit out of my depths here, but I believe a significant contributing factor is how early Apple made multi-CPU Macs available, with the earliest being the summer 2000 revision of the Power Mac G4 tower (dual 500MHz PPC G4s), pre-dating the release of OS X. They made it easier for devs to take advantage of those cores in OS X, because this yielded performance boosts that were difficult to match in the x86 world, which was still heavily single-CPU.

            Because the OS and apps running on it were already taking advantage of multithreading, making them efficiency core friendly was easy since devs only had to mark already-encapsulated tasks as eligible for running on efficiency cores, so adoption was quick and deep.

            Meanwhile on Windows there are still piles of programs that have yet to enter the Core 2 Duo era, let alone advance any further.

            • hamburglar a day ago

              > Apple made multi-CPU Macs available, with the earliest being the summer 2000 revision of PowerMac G4 tower

              Earlier. I did some multiprocessing work on an SMP PowerPC Mac in 1997.

              • nxobject a day ago

                Would this have been with MacOS 7’s Multiprocessing Services? I managed to play with an SMP Mac clone (DayStar Genesis MP), but all I really could do in the end is use some plugins for Photoshop.

                • hamburglar a day ago

                  That does not ring a bell. I believe it was Mac OS 8 or 9.

    • thewebguyd 2 days ago

      > The reduced horsepower relative to M-series isn't a problem for me as much as efficiency is

      Same here. I actually don't care for macOS much, and I'm one of those weirdos who actually likes Windows (with WSL).

      I tried the Surface Laptop 7 with the Snapdragon X Elite, and it's... OK. It still spins up the fans quite a bit and runs hotter than my 14" M4 Pro. It's noticeably slower than the MacBook too, and it doesn't wake instantly from sleep (though it's a lot better than Wintel laptops used to be).

      So I've been on Apple Silicon Macs for the last 4.5 years because there's just no other option out there that even comes close. I'm away from my desk a lot, so battery life matters to me. I just want a laptop with great performance AND great battery life, silent, runs cool, with a high quality screen and touchpad, and decent speakers and microphone.

      MacBooks are literally the only computers on the market that check all those boxes. Even if I wanted or preferred to run Windows or Linux instead, I can't, because there just isn't equivalent hardware out there.

      • ako a day ago

        It’s even worse: Parallels on MacBooks runs windows better than a dedicated windows laptop…

        • littlecranky67 a day ago

          Before the MacBook ARM switch, from 2015 onwards I used to run Linux via Parallels, and it ran better than any Linux I ever used natively on a modern laptop. After installing the Parallels tools you had 2D/3D/video acceleration, clipboard sharing, WiFi/Ethernet bridging, and most importantly, seamless and stable suspend/resume.

        • balder1991 a day ago

          And if you don’t want to pay a subscription, VMWare also breaks no sweat since a long time ago, and it’s very polished at this point.

          I recently tried VirtualBox and it’s finally catching up, seems to work without any problems but I didn’t test it enough to find out the quirks.

        • olyjohn 19 hours ago

          I doubt that's true except in your mind. What are you comparing it to? An old-ass Windows laptop?

          • ako 3 hours ago

            I moved from a Lenovo ThinkPad P1 Gen 2 Core i9 32GB (2020) to a MacBook Pro M1 Max 32GB (2021), and the experience in Parallels beats the experience on the Lenovo machine.

    • rapind 2 days ago

      > On the desktop front I think AMD has been killing it but both companies need to start from scratch for laptops.

      IMO Apple is killing it with the mac mini too. Obviously not if you're gaming (that has a lot to do with the OS though), but if you're OK with the OS, it's a powerhouse for the size, noise, and energy required.

      • cosmic_cheese 2 days ago

        Yeah, for most "normal" users the Mini is pretty ideal. It's got enough power that it's overkill for most folks while being the least intrusive a desktop could possibly be: it's tiny, it doesn't have a power brick, it doesn't make any noise, and it's hardly going to impact your power bill at all.

        • whyoh a day ago

          >it doesn't make any noise

          You can hear the fan at full load, especially on the M4 Pro. I really wish Apple went with a larger case and fan for that chip, which would allow quieter cooling.

          Also, many units are affected by idle (power supply) buzzing: https://discussions.apple.com/thread/255853533?sortBy=rank

          The Mac Mini is quieter than a typical PC, but it's not literally silent like, say, a smartphone.

          • balder1991 a day ago

            That might be a recent phenomenon caused by the CPUs inevitably being pushed closer and closer to their thermal limits, like explained in this video: https://youtu.be/AOlXmv9EiPo

            My Mac Mini M2 never makes any noise; even when I run FFmpeg the fans don't spike. It just gets slightly warmer. And unless I'm doing those heavy CPU-bound activities, every time I touch it it's cold, as if it were turned off, which is very different from my previous Intel one that was always either warm or super hot.

      • sgarland a day ago

        Even if you are into gaming, between native builds and Crossover, it’s quite capable. It’s not going to match a top of the line Windows build with a dedicated GPU, but it’s shockingly capable.

      • MangoToupe 2 days ago

        I've been running a Mac mini as a gaming machine for years; an egpu is much cheaper than building a whole new desktop tower.

        • astrange 2 days ago

          Apple Silicon Macs don't support eGPUs. (At the moment anyway.)

    • hopelite a day ago

      This is just my perspective, but it seems that whatever is leading them to do so, the focus on supporting the Windows environment hamstrings them badly. Apple effectively controls the whole hardware and software stack of any given device; AMD/Intel don't even really control the main board, let alone the efficiencies between all the compatible parts.

      No wonder the Ferrari of computers is more efficient and effective than a cobbled-together junkyard monstrosity... ok, I'll be more generous... the Chrysler of computers.

      I don't want to suggest that Apple is ideal with its soldered constrictions, or that modularity should be done away with, but the reality is that standards need to be tightened up A LOT if the PC market really wants to compete. I for one have no problem not dealing with all the hassle of non-Apple products, because I can afford it. If Apple got its botoxed, manicured head out of its rear end and started offering its products at competitive prices, it would likely totally dominate the majority of the computing market, which would then likely atrophy and effectively die out over time.

      Let's hope that Apple remains pretentious and stubbornly greedy, so that at least we have choice and the PC sector gets a chance to sort out its standards, maybe even to fund a gold-standard functional Linux distro that could at least hold a candle to macOS without drooling all over itself.

  • kwanbix 2 days ago

    If you are OK with the closed apple ecosystem, sure, but I mean, 20% is not that much for 99% of the population.

    Don't get me wrong, I really admire what apple has done with the M CPUs, but I personally prefer the freedom of being able to install linux, bsd, windows, and even weirder OSes like Haiku.

    • traceroute66 2 days ago

      > but I mean, 20% is not that much for 99% of the population.

      As long as you're ok being tethered to the wall, and even then, guzzling power.

      The whole point of Apple Silicon is that its performance is exactly the same on battery as tethered to the wall AND it delivers that performance with unmatched power efficiency.

      It's the same on pure desktop. Look at the performance per watt of the Mac Mini. It's just nuts how power efficient it is. Most people's monitors use more power than the Mac Mini.

      • pico303 2 days ago

        My “fancy” Windows work laptop has 45 minutes of battery life, while my M3 MacBook Pro will go 14 hours compiling C++ or running JavaScript and Docker images, and do so twice as fast as my work laptop could. I’d say you get what you pay for, but my work laptop was around the same price as my M3.

        I wouldn’t be opposed to going back to Linux. But once you stop looking for power sockets all the time and start treating your laptop like a device you can just use all day at any moment, it’s hard to go back.

        • spwa4 2 days ago

          That's because your company's security department has virus scanners scanning every bit of code (including 99% of the virus scanner itself).

          • pico303 2 days ago

            My company literally has four different apps “protecting” me now, including two different malware scanners. Neovim runs like it’s a 286. That said, before they’d installed everything it still wasn’t any faster than my Mac.

      • zdragnar 2 days ago

        I was just looking at an HP laptop with a snapdragon X processor that claimed 34 hours of battery life while watching video.

        It'd be tempting if I had any idea what the software compatibility story would be like. For example, the company I'm contracting with now requires a device monitor for SOC2 compliance (ensuring OS patches are applied and hard drive encryption remains on). They don't even want to do it, but their customers won't work with them without it.

        Surprise surprise, a quick check of the device monitor company's website shows they don't support ARM architecture devices at all.

        • thewebguyd 2 days ago

          It may still work. The Prism emulation is pretty good, almost on par with Rosetta 2.

          I have the Surface Laptop 7 with the X Elite in it. The only thing I've run into that outright didn't run was MS SQL Server.

          It's not my main machine, that is still an M4 MacBook Pro, but I hop on it occasionally to keep up with what Windows is doing or if I need to help someone with something Windows specific. I've got WSL2, Docker, VSCode, etc. all running just fine.

          It's decent, but not amazing. It feels a little slower than the M2 Air I have, but not by much; most of that is probably just Windows being Windows.

          Would be nice to be able to get Linux running on one of these

          • zdragnar 2 days ago

            Sadly, I'm doing dotnet work, including a legacy webforms codebase. Not running mssql server directly, but lots of other tools- visual studio, sql server profiler, sql server management studio, that sort of thing. EVEN IF all of that worked, I have already verified from the company that supplies the device management software that they don't support non-x86 architectures.

            • thewebguyd 2 days ago

              Bummer. They are neat little laptops, and with the X elite 2 (assuming they end up in some windows laptops and aren't exclusively for the new android chromebooks) it's about the closest we'll get to a MacBook on Windows for now.

              I wish Microsoft put more pressure on vendors to support ARM.

        • dijit 2 days ago

          The last Snapdragon X Elite claims really didn't pan out though.

          Which left me bitter quite honestly as I was looking forward to them a lot.

    • dijit 2 days ago

      I keep hearing this, but I'd venture that a majority of those making that claim will most likely end up on Windows full time anyway. Which is not materially worse than macOS, no matter how much Apple keeps shooting itself in the foot with macOS.

      • davrosthedalek 2 days ago

        With WSL2, Windows is better. Sad, but true.

        • dijit 2 days ago

          Hahahahhahahahahhahahahahaha

          No.

          Even if it were better than Lima (and the built-in POSIX/Unix environment), which it ain't, that doesn't make a dent in the mandatory online account, the Copilot shit, and all the rest.

        • illusive4080 a day ago

          This is a very subjective take.

          If you like Windows, you’ll find it better with WSL2. In fact, I see many developers at my org who claim they’ll switch to Windows (from Mac) when we make it available internally.

          However, if you love the Mac, you'll never find Windows palatable, no matter what.

          And then there’s all shades of gray.

        • planb a day ago

          You may like Windows better, but WSL2 is just a virtual machine, with all the downsides (slower, no Docker) that brings. In fact, on my Windows PC I still use WSL1 for that reason.

    • Zak 2 days ago

      It does not appear to me that Macs are closed in the sense that iOS is. It is possible, at least, to install Linux on Apple Silicon Macs.

      There are certainly many more options on the PC side, but it's not because Apple actively blocks users from running another OS.

      • kwanbix 2 days ago

        As far as I understand, the only Linux you can install on an M-series CPU is Asahi Linux. Apple is not doing anything to actively block it, but it's also doing nothing to help Linux be ported.

        • dagmx a day ago

          A big issue there is that there’s a massive backlog of patches to land in the kernel and Asahi are currently working on reducing that.

          Once that’s done, any distro should be able to work.

        • dmitrygr a day ago

          You have no idea how much work everyone inside the kernel and iBoot teams at Apple put into making it possible to run Linux on those MacBooks!

    • whycome 2 days ago

      20% is just the performance difference. They noted the low cost for an Air model as well. What would an equivalent be at that price point? Would it have the same passive cooling and weight features?

    • Greed 2 days ago

      Agreed. Even as an enthusiast if I could take the performance hit and keep the M4's battery life, I'd do it in a heartbeat just for the ability to run linux.

    • elAhmo 2 days ago

      The huge majority of people don't really care about whether an ecosystem is closed or not. Power users, such as developers, actively choose MacBooks, and those are the users most likely to care about that.

      You really think an average person shopping for a computer at Bestbuy cares about installing a different OS on their machine?

    • carlosjobim a day ago

      How about running your Linux and Windows etc virtually on a Mac? From what I've understood, people say it works great. But I haven't any experience myself.

    • jamespo 2 days ago

      I'd like to have haiku as a boot option, but how well does it work on modern laptop hardware?

    • MangoToupe 2 days ago

      > I personally prefer the freedom of being able to install linux, bsd, windows, and even weirder OSes like Haiku.

      I certainly don't think that matters to the vast majority of the population

      • VBprogrammer 2 days ago

        The majority of the population is running a $300 laptop from Amazon. They certainly aren't popping used car money every 2-3 years like the real enthusiasts are.

        • MangoToupe a day ago

          > They certainly aren't popping used car money every 2-3 years like the real enthusiasts are.

          Sorry, I don't get the reference. What sort of expenses are you referring to? For the price of a used car you can get pretty much any workstation money can buy.

  • mrheosuper a day ago

    >$800 the M4 Air just seems like one of the best tech deals

    Not at all. You are stuck with a machine that has only 256GB of SSD (and not upgradeable), a 60Hz LCD screen, and only 2 I/O ports.

    The M4 may be the best mobile CPU, but that doesn't mean every machine it's in is the best.

    • littlecranky67 a day ago

      As an owner of a MacBook Pro M4, I am actually looking to sell and downgrade to the Air, because during my day-to-day developing I can't get close to utilizing the M4. Regarding the limitations: 256GB is enough for me, because 95% of the time I use my MacBook docked into my Thunderbolt dock, where I have an external drive (also for Time Machine backups), so there's plenty of space for big files that I rarely use or can do without on the go. That also solves the limitation of I/O ports, because the dock brings plenty of USB ports (plus the ones from my screens, which are connected to the dock).

      Not sure why 60Hz is a limitation; I've been on 60Hz for 20+ years, and outside of gaming I don't see any value in going higher.

      • sgarland a day ago

        I went from a base M1 Air to a mid-spec’d M4 Pro. My main reasons were the 14” screen size, and the lack of a wedge shape for the new Airs (I loved the wedge). IMO, 14” is the perfect size for a laptop that actually sees time in your lap. My 13” Air was fine, but the extra inch is just enough to make a difference, while not feeling overly bulky.

        Re: refresh rate, it’s nice, but I wouldn’t miss it most of the time. I have my external monitor for my work M3 running at 120 Hz because I can, not because I need it.

      • mrheosuper 15 hours ago

        Not everyone has the luxury of a fixed setup where you can just plug in your dock and be good to go. In fact, if that is your use case, a Mac Mini is a much better option.

        • littlecranky67 7 hours ago

          Well, the price difference between an M4 MacBook Pro and an Air buys you several docking stations.

          Also, a Mac Mini is not mobile, and even if I took the Mini, it has neither a screen nor a keyboard. The thing is, I don't need the drive WHILE I'm traveling; I need my laptop at the destination. And because that is less than 5% of the time I use it, there is no issue in carrying along the external 2.5" HDD that I keep on my dock. Your personal use case may vary.

  • remix2000 2 days ago

    Depends how you frame it; in my eyes, I'd be paying $1.4k USD after sales tax (at least here in the EU) for a laptop with a measly 16 gigs of RAM… I could buy two normal laptops that outperform it for the price of one!

    • ysleepy a day ago

      The 16/256 M4 MacBook Air costs 850€ including 19% VAT here in DE, so your numbers are way off, or you live somewhere with very unfortunate pricing in the EU.

      • jijijijij a day ago

        Certainly not from the Apple Store, where it's 1.199,00€.

        The cheapest I found was about 1000€. Buying a one-off offer from some random webshop means you would have to deal with them in case of repairs or warranty issues.

        And yeah, with effectively 200GB max of non-upgradable SSD storage, no wonder there are cheap offers; that's borderline unusable for almost everyone who needs more than a web browser.

    • criddell 2 days ago

      If you use the $1.4k USD laptop for 2 years, that works out to around $2 / day. If it’s a Mac, it probably has some resale value at the end of the time bringing the cost down closer to $1 / day.

      For a work machine, that’s pretty easy to justify.
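
      (Back of the envelope: $1,400 / 730 days ≈ $1.92 a day; if it resells for, say, $700 after two years, a guess on my part, that's ($1,400 - $700) / 730 ≈ $0.96 a day.)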

      • littlecranky67 a day ago

        I buy MacBooks with AppleCare as a self-employed contractor (so no VAT, and the expense is tax-deductible). I sell them after 2-3 years (before the AppleCare expires!) on eBay, and I would actually make a profit if I didn't declare the sale for tax. Which I personally do declare, as not doing so would be tax evasion. But I've heard of people who don't, and who basically use MacBooks for free or at a profit. Especially since Germany recently changed the write-off period from 3 years to 12 months.

      • remix2000 2 days ago

        I wouldn't expect an Apple product to last that long… This is from my personal experience and also family members who tried Apple, so your mileage may vary, I just wouldn't trust it.

        Ignoring that though, if work machine means an Excel machine, then it's probably overspending IMO. If work machine means workstation, then you'd probably want one of the >1.6k models with more working memory… or just not go Apple.

        • JSR_FDED a day ago

          Every iMac, Mac Pro, MacBook Air, Mac Mini and MacBook Pro I’ve had (for me and family) has been indestructible.

          A few months ago, Spotify on an ancient Intel Mac Mini in the living room started complaining that the new version of Spotify is no longer compatible with that Mac. Then I ran OpenCore and updated macOS to a much newer version, and Spotify is happy. Now I'll get even more years out of that machine.

        • achandlerwhite a day ago

          Interesting, most people have the opposite experience. My 4-year-old M1 still has amazing performance and OS support.

        • swiftcoder a day ago

          Seems like you had a run of bad luck - I hand my old Macs off to family members, and multiple are well past the 10 year mark at this point.

          Probably need to get the batteries replaced somewhere past the 5 year mark, but otherwise the durability is unmatched.

    • hyper_frog a day ago

      Is there a laptop that can outperform a macbook M4? Genuine question.

    • bayindirh 2 days ago

      Honestly asking, can you use either of them for a decade?

      From replacement parts and physical endurance perspective, I mean.

      • remix2000 2 days ago

        I don't think it's realistic to expect any tech made in the twenties to last a decade…

        • bayindirh 2 days ago

          I still expect Mac and Thinkpad hardware to last a decade, sans their batteries. A good desktop PC made from better parts will also endure without much effort.

  • hu3 2 days ago

    > For $800 the M4 Air just seems like one of the best tech deals around.

    Only if you don't mind macOS.

    • CharlesW 2 days ago

      I understand the point you're making, but FWIW I run Windows and Linux under Parallels and it works great. Colima/Lima is excellent, too: https://github.com/abiosoft/colima

      Windows on ARM performance is near native when run under macOS. `virtiofs` mounts aren't nearly as fast as native Linux filesystem access, but Colima/Lima can be very fast if you (for example) move build artifacts and dependency caches inside the VM.
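
      (For what it's worth, and this is from memory of Colima's docs so verify before relying on it: something like `colima start --vm-type vz --mount-type virtiofs` gets you Apple's Virtualization.framework plus virtiofs mounts, and keeping things like node_modules or `~/.cargo` on the VM's own disk sidesteps the mount overhead entirely.)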

      • traceroute66 2 days ago

        There's also UTM, available free for download or you can make a donation to the devs by purchasing it from the App Store.

      • ruszki a day ago

        > Colima/Lima is excellent

        Except when you need something like UDP ports, for example. I tried it for 2-3 weeks, but I kept running into issues like that. In the end I just started using custom Alpine VMs with UTM and running Docker inside them, with all networking configured with pf.

      • jijijijij a day ago

        > I understand the point you're making, but FWIW I run Windows and Linux under Parallels and it works great.

        See, that's where the macOS shitshow begins: Parallels costs €189.99, and it looks like they are pushing towards subscriptions. I am not in the ecosystem, but Parallels is the only hypervisor I've ever seen recommended.

        Another example is Little Snitch. Beloved and recommended firewall. Just 59€! (IIRC, macOS doesn't even respect the user's network configuration when it comes to Apple services, e.g. bypassing VPN setups...)

        Now, don't get me wrong, I am certain there are ways around it, but Apple people really need to reflect on what it commonly takes to run a frictionless macOS. It's pretty ridiculous, especially coming from Linux.

        I mean c'mon... paying for a firewall and a hypervisor? Even running proprietary binaries for these kinds of OS-level features seems moderately insane.

    • matdehaast 2 days ago

      And therein lies the problem. Apple has managed to push a hardware advantage into something that makes a difference.

    • apercu 2 days ago

      I used to not, but it’s getting worse and worse.

      Still better than all the alternatives for someone like me: it straddles clients expecting MS Office, gives me a *nix out of the box, and can run Logic, Reaper, MainStage.

      • jijijijij a day ago

        > gives me a *nix out of the box, and can run Logic, Reaper, MainStage

        Reaper has a native Linux client. Logic and MainStage... are you serious? :D

  • bhouston 2 days ago

    I use a MacBook Air 15" as my full-stack main development machine. It is just light and portable on the go. At home I just plug it into a docking station with 10 GigE and output to a 48" OLED monitor - a beautiful setup.

    • robbinb 2 days ago

      Just curious what brand etc. of monitor and docking station you're using here.

      • littlecranky67 a day ago

        Not the OP, but I use my MacBook Pro M4 with a 5-year-old Dell TB19 Thunderbolt dock (not the USB-C one!). I have 2x 1080p screens aged 10+ years connected to the dock, I use the MacBook as a 3rd display, and the laptop's keyboard is my daily driver. I used the very same setup with my 2019 MBP, so I assume the M4 Air can handle it too.

      • bhouston 2 days ago

        I have had really good experience with OWC docking stations - rock solid compared to Dell ones I've had in the past: https://www.owc.com/solutions/connectivity

        I won't recommend my monitor because it has auto-dimming you can not turn off. Good but not great.

  • jeswin a day ago

    > I've been looking at non-apple equivalents that have similar performance/power as the M lineup and it seems they all lag about 20%+.

    Most of it can be explained by TSMC. If you compare Apple's 5nm parts (M2) with AMD's 4nm parts, you'll see a performance swing of about the same magnitude, but in favor of AMD. The M5 is 3rd-gen 3nm.

  • criddell 2 days ago

    > it seems they all lag about 20%

    So they are about one generation behind? That's not bad really. What's the AMD or Intel chip equivalent to the M2 or M3? Is somebody making a fanless laptop with it?

    • Rohansi 2 days ago

      Apple is also always at least one generation ahead on process/lithography, because they buy out all of the initial capacity. That alone accounts for a decent chunk of the difference.

      I don't think the market is there for fanless non-Mac laptops. Most people would rather have a budget system (no $ for proper passive cooling) or a more powerful system.

      • criddell 2 days ago

        > Most people would rather have a budget system

        The low end of the market is for sure bigger but I think Apple has shown that the higher end can be profitable too. Dell, HP, Lenovo, and the other big laptop makers aren't afraid of having a thousand different SKUs. They could add one more for a higher end machine that's fanless.

        • Rohansi 2 days ago

          Most of those SKUs are component swaps. A proper passively cooled laptop would require a completely different chassis design to act as an extension of the heatsink.

          I bought a MacBook Air because it was cheaper and met my needs. Being passively cooled was just a nice bonus.

          • criddell a day ago

            As someone who has been buying ThinkPads for the past 20 years, Lenovo needs to spend more time working on thermals anyway. If I don't power my laptop down before stuffing it in my backpack, I'll have a hot, almost-dead machine by the time I get to my destination.

            • Rohansi 21 hours ago

              I wonder how much of that issue is related to crappy lid-open sensors. AFAIK most of them work by sensing a magnet placed in the frame, so your laptop sleeps randomly when a magnet passes over it; MacBooks don't do this. It's dumb that they use a single magnet instead of one on each side, but it sure is cheaper.

              • criddell 19 hours ago

                I think it's 75% Microsoft's fault, with Modern Standby. They want the machine to go into a low-power state rather than sleep, so that it can still receive email and other notifications just like a phone does. But since the thermals are total garbage on most PCs, even the lower-power modes need active cooling, which doesn't really work when the machine is in a bag. After a few minutes all the fans are blasting.

  • foresterre a day ago

    I find this usually doesn't matter as much as you seem to suggest.

    I've been running linux laptops with AMD/intel for years, and while some focus on more battery life would be welcome, the cpu never bothered me.

    My primary limitation is available RAM (esp. when debugging React Native against a complete local Docker setup), which unfortunately on both AMD/Intel, but far more so on Apple, is usually tied to the higher-compute CPUs. That drives up the cost of the laptop (not even counting the extra cost of RAM on Apple).

    The only really CPU-intensive processes I run locally on a laptop are Rust builds, and even then I would prioritize RAM over CPU if that were possible, because Rust builds are fast enough, especially when compiling incrementally (and I almost never do LTO release-like builds locally unless doing benchmarking or profiling work).

    • IshKebab a day ago

      > Rust builds are fast enough

      I love Rust but I think you might be the first person to say that!

  • franczesko 2 days ago

    AMD asleep? I don't think that's accurate.

    • matdehaast 2 days ago

      If you read the first half of the sentence then yeah.... The complete sentence clarifies "with their mobile lineup"

      • makeitdouble a day ago

        The 395 Max is in laptops and slots between the M3 and M4 on Geekbench scores. That's not top of the line, but a decent result IMHO.

  • pizza234 2 days ago

    Each Ryzen generation increased performance significantly, so AMD is definitely not asleep at the wheel.

    • matdehaast 2 days ago

      I'm talking specifically about their mobile lineup, not desktop. And more specifically the performance to power efficiency the M series is getting. It is more than 2 generations behind.

      • pizza234 2 days ago

        Each Ryzen mobile generation also improved efficiency, particularly the last generation (AI PRO 300).

        Intel, on the other hand, started out a few generations ago with an edge in terms of efficiency, and now they're behind; they are definitely the one that fell asleep.

        The fact that ARM may have unreachable efficiency doesn't mean that AMD, as an x86 producer, is doing nothing.

  • satellite2 2 days ago

    For the same price you have an AMD Ryzen AI 9 365, which has 50% higher performance on cpubenchmark.

    • matdehaast 2 days ago

      Think you are mistaken. The M4 beats the Ryzen AI 365 in both single and multicore benchmarks

      • satellite2 2 days ago
        • FootballMuse 2 days ago

          Passmark is an outdated benchmark, not well optimized for ARM. Even so, the single-thread marks are 3864 (AI 365) vs 4550 (M4).

          OTOH, Geekbench correlates (0.99) with SPEC, the industry standard in CPU benchmarking and what enterprise companies such as AWS use to judge CPU performance.

          https://medium.com/silicon-reimagined/performance-delivered-...

          • astrange 2 days ago

            Hmm, why would you need to optimize a benchmark for something? Generally it's the other way round.

            • aleph_minus_one 2 days ago

              > Hmm, why would you need to optimize a benchmark for something? Generally it's the other way round.

              It has always gone both ways. This is why there exist(ed) quite a lot of people who have/had serious doubts about whether [some benchmark] actually measures the performance of, e.g., the CPU or the quality of the compiler.

              The "truce" that was adopted concerning these very heated discussions was that a great CPU is of much less value if programmers are incapable of making use of its power.

              Examples that evidence the truth of this "truce" [pun intended]:

              - Sega Saturn (very hard to make use of its power)

              - PlayStation 3 (Cell processor)

              - Intel Itanium, which (besides some other problems) needed a super-smart compiler (which never existed) so that programs could make use of its potential

              - in the last years: claims that specific AMD GPUs are as fast as or even faster than NVidia GPUs (also: for the same cost) for GPGPU tasks. Possibly true, but CUDA makes it easier to make use of the power of the NVidia GPU.

              • astrange 2 days ago

                > - Intel Itanium, which (besides some other problems) needed a super-smart compiler (which never existed) so that programs could make use of its potential

                Well, no such thing is possible. Memory access and branch prediction patterns are too dynamic for a compiler to be able to schedule basically anything ahead of time.

                A JIT with a lot of instrumentation could do somewhat better, but it'd be very expensive.

AlphaAndOmega0 2 days ago

I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations. The rumors about an upcoming touchscreen Mac are interesting, perhaps Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing. A man can dream..

  • Fwirt 2 days ago

    There are a number of interesting creative apps for iPad that can make full use of its capabilities. A good example is Nomad Sculpt. There's also CAD software, and many DAWs. I haven't tested Numbers yet, but I would assume it's fairly well optimized.

    This really reminds me of the 80/20 articles that made the frontpage yesterday. Just because a lot of HN users lament that their 20% needs (can't run an LLM or compile large projects on an iPad) aren't met by an iPad doesn't mean that most people's needs can't be satisfied in a walled garden. The tablet form factor really is superior for a number of creative tasks where you can be both "hands on" with your work and "untethered". Nomad Sculpt in particular just feels like magic to me; with an Apple Pencil it's almost like being back in my high school pottery class without getting my hands dirty. And a lot of the time when you're doing creative work you're not necessarily doing a lot of tabbing back and forth, so being able to float reference material over the top of your workspace is enough.

    At this point Apple still recognizes that there is a large enough audience to keep selling MacBooks that are still general purpose computing devices to people who need them. Given their recent missteps in software, time will tell if they continue to recognize that need.

    • mort96 2 days ago

      I would not want to use CAD software or a DAW without a proper mouse and keyboard, and maybe a 3D mouse too. An interface made for touch really isn't suitable. Even connecting a mouse to an iPad is a pretty shitty experience, since all the UI elements are too big and you have to wait around for animations to finish all the time.

      • WillAdams 2 days ago

        Shapr3D is an interesting 3D design tool which has some CAD capabilities and an interface optimized for use with a stylus --- Moment of Inspiration was similarly oriented (I really ought to try it).

      • scarface_74 2 days ago

        How is connecting a Bluetooth mouse to an iPad any different than connecting one to a computer? Especially with iPadOS 26?

        • monkmartinez 2 days ago

          That is just one very simple part, connecting the mouse. Literally everything else sucks on iOS. File management, hidden menus, running multiple apps, system management... and the list goes on. Need to convert a STEP file to something else on the iPad? Download 15 apps to see which one works, then try to find the converted file in the abomination of a file system? iOS is hot garbage.

          • NetMageSCW a day ago

            iPadOS works differently from macOS and if you aren’t willing to learn that, you will think it is bad. The problem isn’t the OS.

            • mort96 17 hours ago

              What if you've learned how to work around many of iPadOS's limitations and still think those limitations are bad?

              Downloading 15 different paid or free-with-in-app-purchases or free-with-ads apps to see which one actually does what it's supposed to do is one of those workarounds. I've learned how to do it and done it a bunch of times and I don't really like it. I much prefer the macOS/Windows/Linux workflow where there's typically some established, community run and trustworthy FOSS software to do whatever conversion you need.

    • CompoundEyes 2 days ago

      I work with Logic Pro X often. I bought an iPad Pro M4 and the Logic version for it is really compelling. Touch faders and the UI are well thought out. The problem is they want me to subscribe to use it. I wish I could just outright purchase it for $300.

      • 1123581321 a day ago

        They should charge less if offering a one-time purchase. $300 only beats $50/year after 6-7 years, depending on the discount rate you would assign in a present-value calculation; in software it's more typical to calibrate that around 2-3 years. I like the design as well.
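
        A quick sketch of that break-even math in Python (the $300 and $50/year figures from above; the discount rate is whatever you want to assume):

          def breakeven_years(one_time=300.0, per_year=50.0, rate=0.0):
              # Year at which the discounted subscription payments
              # (paid at the start of each year) overtake the one-time price.
              pv, year = 0.0, 0
              while pv < one_time:
                  pv += per_year / (1 + rate) ** year
                  year += 1
              return year

          print(breakeven_years(rate=0.00))  # -> 6 (undiscounted)
          print(breakeven_years(rate=0.05))  # -> 7 (at a 5% discount rate)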

      • wintermutestwin a day ago

        >The problem is they want me to subscribe to use it.

        WTH?? This is the first I am hearing this nonsense. Yet another reason why I won't get an iPad even though I am all in on Apple's ecosystem. It seems that Apple sees iPad users as the bottom feeders ripe for exploitation.

    • serbuvlad 2 days ago

      Yes, but there is simply no reason to have two devices. There are a large number of Windows tablet-laptop combo machines that work perfectly well and prove touch apps can work on a desktop OS.

      Yeah, it took a long time for MS to make that not suck after Windows 8, but touch and tablet interactions on Windows 10 and 11 work just fine.

    • bigyabai 2 days ago

      > There's also CAD software, many DAWs.

      Assertions like this are what kill the iPad. Yes, DAWs "exist", but they can only load the shitty AUs that Apple supports on the App Store. Professional plugins like Spectrasonics or U-He won't run on the iPad, only the Mac. CAD software "runs", but only supports the most basic parametric modeling. You're going to get your MacBook or Wintel machine to run your engineering workloads if that's your profession. Not because the iPad can't do these things, but because Apple recognizes that it can double its sales by gimping good hardware. No such limitations exist on, say, the Surface lineup. It's wholly artificial.

      I'm reminded of Damon Albarn's album The Fall, which he allegedly recorded on an iPad. It's far and away his least professional release, and there's no indication he ever returned to iOS for another album. Much like the iPad itself, The Fall is an enshrined gimmick fighting for recognition in a discography of genuinely important releases. Apple engineers aren't designing the next unibody Mac chassis on an iPad. They're not mixing, mastering, and color-grading their advertisements on an iPad. God help them if they're shooting any footage with the dogshit 12MP camera they put on those things. iPads do nothing particularly well, which is acceptable for moseying around the web and playing Angry Birds, but literally untenable in any industry with cutting-edge, creative, or competitive software demands. Ask the pros.

      • hahayup 2 days ago

        It's such a shame that the iPad has these limitations. It's such an incredible device: lightweight, very well designed, incredible screen, great speakers, etc. I really do feel that if Apple sold a MacBook in the style of a Surface Book, i.e. an iPad tablet running macOS which could dock to a keyboard and trackpad with a potential performance boost (graphics card, storage, whatever), it would be my dream device.

        • bigyabai 2 days ago

          All I want is to put Linux on it. I already own copies of Bitwig et al.; if the iPad Mini didn't lock me into a terrible operating system then I might want to own one. But I'm not spending $300 for the "privilege" of dealing with iPadOS.

  • zaptrem 2 days ago

    > perhaps Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing

    They've been doing exactly this since the first M1 MacBooks came out in 2020.

  • lanza 2 days ago

    > I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

    Literally everything you do gets the full power of the chips. They finish tasks faster using less power than previous chips. They can then use smaller batteries and thinner devices. A higher ceiling on performance is only one aspect of an upgraded CPU. A lower floor on energy consumed per task is typically much more important for mobile devices.

    • mort96 2 days ago

      Right but what if I don't notice the difference between rendering a web page taking 100ms and it taking 50ms? What if I don't notice the difference between video playback consuming 20% of the chip's available compute and it consuming 10%?

      • GeekyBear 2 days ago

        I'm pretty sure that users of the announced Blender for iPad port will notice any additional horsepower.

      • Aurornis a day ago

        > but what if I don't notice the difference between rendering a web page taking 100ms and it taking 50ms?

        You probably won’t notice this when using the new machine.

        For me, it only becomes noticeable when I go back to something slower.

        It’s easy to take the new speed as a given.

        > What if I don't notice the difference between video playback consuming 20% of the chip's available compute and it consuming 10%?

        You would notice it in increased battery life. A CPU that finishes the task faster and more efficiently will get back into low power mode quicker.

      • simonh 2 days ago

        Faster can also mean more efficient for a lot of tasks, because the cpu can idle sooner so your battery can last longer, or be smaller and lighter.
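
        As a toy version of that arithmetic (numbers invented for illustration: both chips draw ~5 W while active and ~0.1 W at idle, and the faster one just finishes sooner):

          ACTIVE_WATTS, IDLE_WATTS, WINDOW_S = 5.0, 0.1, 10.0

          def window_energy(task_seconds):
              # Joules spent over a 10 s window: one active burst,
              # then idle for the remainder of the window.
              return (ACTIVE_WATTS * task_seconds
                      + IDLE_WATTS * (WINDOW_S - task_seconds))

          print(window_energy(2.0))  # slower chip: 10.8 J
          print(window_energy(1.0))  # faster chip:  5.9 J

        Halving the task time nearly halves the energy for the same work, even before counting the newer core's better perf/W.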

    • monkmartinez 2 days ago

      "Literally everything" doesn't amount to much if I can't actually control the stupid thing.

  • cromka 2 days ago

    Since Apple makes a significant amount of money selling the hardware itself, I really wonder why they wouldn't let people install Linux on it, with full support. After all, it's not like this would jeopardize macOS/iPadOS App Store earnings; Linux users would simply buy into Apple hardware they hadn't even considered before, and only a fraction of macOS/iPadOS users would switch to using Linux.

    • socalgal2 2 days ago

      Do they disallow it, or just not provide active support? Active support requires paying employees to keep it working. Ignoring it and letting volunteers do it costs nothing.

      • cromka 2 days ago

        You make it sound like these are the only two options, meanwhile what they _most importantly_ fail to deliver is documentation.

        And that's for macOS. For any other platform they actively prohibit third-party operating systems.

      • bee_rider a day ago

        I think the comment one up was about Linux on the iPad, which is mostly impossible. Well, IIRC there are some projects to get, like, Alpine Linux running inside iOS, but it's emulated or something, and pretty slow, no GUI, quite limited, etc.

    • dangus a day ago

      Last I checked, Apple makes more revenue on services than on Mac and iPad combined. With higher profit margins.

  • runjake 2 days ago

    Questions for you:

    1. If you don't know what to do with it, why did you buy it?

    2. If you wanted a general purpose computer, why did you buy an iPad?

    3. Which iPadOS limitations are particularly painful for you?

    • wintermutestwin a day ago

      >Which iPadOS limitations are particularly painful for you?

      Browser engine lock-in - no Firefox+uBlock Origin = me no buy. And yes, there is Orion, which can run uBlock, but it and Safari have horrible UI/UX.

    • Rohansi 2 days ago

      There are other differences with the iPad Pro lineup unrelated to the SoC. It's just strange to think that a very capable laptop chip is being put into a device with far more limitations.

      • runjake 2 days ago

        I'd rather that than an underpowered chip.

        It was mentioned, almost as a side comment somewhere, that the M chip is in there for multitasking and higher-end image/video editing for "Pros". I could certainly use the M4 in an iPad Pro for iPadOS 26 and its multitasking. I run into occasional slowness when multitasking on my M2 iPad Air.

    • AlphaAndOmega0 a day ago

      1. I do know what to do with it. I take notes, a lot, in my work as a doctor. That's been the case since I owned an iPad Air from 2020, which I replaced with an 11-inch M1 iPad Pro (which broke), and I finally caved and bought a 13" iPad Pro to replace it. I ended up getting the M4 model because there just didn't seem to be older ones reasonably available. Even the M1 was more than fast enough for the overwhelming majority of iPadOS applications.

      Why an iPad? Android tablets have been... not great for a long time. The pencil is very handy, and the ecosystem has the best apps. Also, I know a few rather handy tricks Safari can do, such as exporting entire webpages as PDF after a full-screen screenshot, that are very useful to my workflow.

      2. I already own multiple general purpose computers. They're not as convenient as an iPad. My ridiculously powerful PC, or even my decent laptop, doesn't allow the same workflow. However, that's not an intentional software limitation, it's a consequence of their form factor, so I can't hold Microsoft to blame. On the other hand, Apple could easily make an iPad equivalent to a MacBook by getting out of the way.

      3. The inability/difficulty of side-loading apps, the restriction to a locked-down store. Refusing to make an interface that would allow for laptop-equivalent usage with an external/Bluetooth mouse and keyboard. You can use an external monitor, but a 13" screen would already be perfectly good if window management and mouse-and-keyboard usage weren't subpar. Macs and iPads have near identical chips (the differences between an M chip for either are minor), and just being able to run macOS apps on the device would be very handy. Apple has allowed developer-opt-out emulation of iOS and iPadOS apps on the Mac for a while now; why not the other way around?

      If not obvious from the fact that I'm commenting on HN, I would gain utility from terminal access, the ability to compile and run apps on device, a better filesystem etc. Apple doesn't allow x86 emulators, nor can I just install Proton or Wine. If I can't side-load on a whim, it's not a general purpose computer. I can't use a browser that isn't just reskinned Safari, which rules out a great deal of obvious utility. There are a whole host of possible classes of apps, such as a torrent manager, which are allowed on other platforms but not on iPadOS. It's bullshit.

      My pc and laptop simply aren't as convenient for the things I need an iPad for, and they can't be. On the other hand, my iPad could easily do many things I rely on a PC for, if Apple would get out of the way. iPadOS 26 is a step in the right direction, but there's dozens left to go.

      • runjake 21 hours ago

        Thanks for the response. I'd say you are a key target audience for the iPad Pro. I see most of your points. The only point I can't grok is "the CPU is too good", but I suppose that is more about lamenting the lack of Mac-like functionality than the CPU.

        All I can say is: stay tuned.

  • doctoboggan 2 days ago

    > The rumors about an upcoming touchscreen Mac are interesting

    What rumors have you seen? Anytime I've seen speculation, Apple execs seem to shut that idea down. Is there more evidence this is happening? If anything, Apple's recent moves to "macify" iPadOS indicate their strategy is to tempt people over into the locked down ecosystem, rather than bring the (more) open macOS to the iPad.

    • cosmic_cheese 2 days ago

      Current rumors point to the M6 generation of MBPs being a significant redesign and featuring an OLED touch panel screen.

      I don't understand the appeal, even a little bit. Reaching up to touch the screen is awkward, and every large touch panel I've used has had to trade off antiglare coating effectiveness to accommodate the oleophobic coating. For me, this would be an objective downgrade — the touch capability would never get used, but poor antiglare would be a constant thorn in my side. I can only hope that it's an option and not mandatory, and I may upgrade once the M5 generation releases (which is supposedly just a spec bump) as insurance.

      • WillAdams 2 days ago

        It's convenient, and it also makes using a stylus far easier.

        FWIW, I often rotate my Samsung Galaxy Book 3 Pro 360 so that the screen is in portrait mode, then hold the laptop as if it's a book and use a stylus and touch on the display with my right hand, and operate the keyboard for modifiers/shortcuts with my left, or open it flat on a lapdesk.

      • mc32 2 days ago

        Smudges are off-putting... but there are times when it would be very convenient to be able to scroll or tap on a touchscreen. There are times when presenting that a touchscreen would be preferable to a mouse or touchpad. It's not often, but it's nice to have.

        And in regards to smudges, I mean, just don't use the touchscreen unless you have to, and problem avoided.

        Antiglare can be a thing, but that can be dealt with by avoiding strong lighting behind you.

        • cosmic_cheese 2 days ago

          There's still the issue of accidentally triggering things (when e.g. adjusting the screen), and sometimes you don't have control of your surrounding lighting. I'd still prefer touch to be entirely optional.

    • wlesieutre 2 days ago

      https://x.com/mingchikuo/status/1968249865940709538

      > @mingchikuo

      > MacBook models will feature a touch panel for the first time, further blurring the line with the iPad. This shift appears to reflect Apple’s long-term observation of iPad user behavior, indicating that in certain scenarios, touch controls can enhance both productivity and the overall user experience.

      > 1. The OLED MacBook Pro, expected to enter mass production by late 2026, will incorporate a touch panel using on-cell touch technology.

      > 2. The more affordable MacBook model powered by an iPhone processor, slated for mass production in 4Q25, will not support a touch panel. Specifications for its second-generation version, anticipated in 2027, remain under discussion and could include touch support.

    • latexr 2 days ago

      > Anytime I've seen speculation, Apple execs seem to shut that idea down.

      They also said they weren’t merging iOS and macOS, and with every release that becomes more of a lie.

      https://www.youtube.com/watch?v=DOYikXbC6Fs

      • CharlesW 2 days ago

        Strategies change. That was 7 years ago, pre-Apple Silicon. It turns out that people want windowing options on their large and expensive tablet, to do long-running tasks in the background, etc.

        • jtbayly 2 days ago

          If that were all they were doing, nobody would be concerned. It's the crapifying of macOS in order to make it work with a touch interface that drives everybody bonkers about the slow merge.

          • jama211 2 days ago

            I have Tahoe and it's just as good at being a desktop OS as any of the previous ones. Not sure what you're referring to.

            • ethanwillis 2 days ago

              There have been lots of complaints all over the place that contradict your experience.

              One article that talks about it: https://osxdaily.com/2025/09/19/why-im-holding-off-on-upgrad...

              For less discerning users maybe the rough edges aren't that noticeable. But the point of choosing Apple products is you should be a discerning consumer.

              • jama211 17 hours ago

                That article basically mentions that they've heard some apps crash a bit, which is anecdotal and not uncommon with betas/new upgrades before a patch or two, and that the author personally dislikes or has trouble with some of the transparency and other design changes.

                Neither of those things worries me personally, and I think the previous user calling it "crapifying" is still somewhat of an overreaction. Obviously, from an accessibility standpoint, transparency/legibility is important, but as far as I'm aware tweaks are being made, and these things can also be turned off or modified in the accessibility settings.

              • detourdog 2 days ago

                This is a pretty insulting comment. I'm sure there is a better word than discerning.

                • ethanwillis a day ago

                  I was actually trying to be more neutral :\ than saying something like "consumers with taste".

                  The point I'm trying to make is that "Apple consumers" are more critical.

                  • detourdog a day ago

                    I understand. I have been using a Mac since 1984, and I actually like glass more than the flat aesthetic we have been through. I see it as closer to Aqua and a subtler skeuomorphic effort. I have reported to Apple some of the problems Liquid Glass has. I see Liquid Glass as better than what we had before.

                  • jama211 17 hours ago

                    Also, me having a different experience from you doesn't make me any less critical or discerning, or mean I have less taste.

                    There's no good way to phrase a thought that is fundamentally flawed.

        • latexr 2 days ago

          > That was 7 years ago, pre-Apple Silicon.

          There have been rumours of Apple wanting to shift Macs to ARM chips for 14 years. When they made that announcement, they already knew.

          https://en.wikipedia.org/wiki/Mac_transition_to_Apple_silico...

          It was obvious it was going to happen. I remember seeing Apple announcing iPads doing tasks my Mac at the time could only dream of and thinking they would surely do the switch.

          > It turns out that people want windowing options on their large and expensive tablet, to do long-running tasks in the background

          The problem isn’t them making iOS (or iPadOS) more like macOS, it’s them doing the reverse.

          • CharlesW 2 days ago

            > When they made that announcement, they already knew.

            Yep, the ongoing convergence made that pretty clear. The emphatic "No" was to reassure 2018's macOS developers that they wouldn't need to rewrite their apps as xOS apps anytime soon, which was (and is) true 7 years later.

            This is the same session where Craig said, "There are millions of iOS apps out there. We think some of them would look great on the Mac." and announced that Mojave would include xOS apps. Every developer there understood that, as time went on, they would be using more and more shared APIs and frameworks.

            > The problem isn’t them making iOS (or iPadOS) more like macOS, it’s them doing the reverse.

            That ship has sailed, but it's also completely overblown.

            • latexr 2 days ago

              > but it's also completely overblown.

              Speak for yourself. I for one despise the current direction of the Mac and the complete disregard for the (once good) Human Interface Guidelines. It’s everywhere on macOS now.

              Simple example: the fugly switches which replaced checkboxes. Not only do they look wrong on the Mac, they're less functional. With checkboxes you can click their text to toggle them; not so with the switches.

              I’m not even going to touch on the Liquid Glass bugs, or I’d be writing a comment the length of the Iliad.

    • justinator 2 days ago

      Chances that there are both a folding iPhone and a Touchscreen Mac somewhere in the skunk works of Cupertino are 100%.

      The Apple Vision Pro was a far more extreme product and was kept pretty well under wraps. (tho a market failure).

      • dwaite a day ago

        The line for market success for a first generation, $3500 VR headset is drawn in different places for different people.

        • justinator 19 hours ago

          Market success? I'd go with profitability.

          Beta quality, and expense without value, are just some of the reasons for its failure.

  • layer8 2 days ago

    The iPadOS limitations are largely orthogonal to being able to make use of the available performance, IMO. For example, search in large PDFs could certainly still be faster, and I don’t think it particularly suffers from iPadOS limitations.

  • jsheard 2 days ago

    It'll get even weirder if the rumoured MacBook Lite with an iPhone processor ends up happening. Incredibly powerful tablets constrained by a dumbed down operating system, sold right next to much weaker laptops running a full fat desktop environment.

    • gloxkiqcza 2 days ago

      Well, the A19 Pro beats the M1 in benchmarks, so while the rumored MacBook might be weaker than mid-to-high-end iPads, it won't be a slow machine in general.

    • dangus a day ago

      Is that really rumored? Sounds like kind of a weak rumor to me. The MacBook Air already exists.

      Apple already makes low cost versions of those, which are the previous models that they continue to manufacture.

  • cainxinth 2 days ago

    I buy the higher end Apple products not because I plan to use all their power immediately, but because I keep my devices a very long time and want them to retain usability right to the end.

    • nerdsniper 2 days ago

      Same here. My launch-day M1 MBP is finally starting to show its age; an M5 with twice the perf will be a nice upgrade.

      • addandsubtract 2 days ago

        Is it, though? I feel like everything on my M1 is still as snappy as it was on day 1. My previous MacBook definitely showed its age after 4 years, but I'm happy to use this one for at least another 2-4.

        • prawn a day ago

          Mine is an M1 Max and gives me no gripes after four years. Like you, I also felt as though past laptops felt their age sooner. I'm typically using Photoshop, Lightroom, Resolve, Docker and other usual stuff at any given time.

          • rkomorn a day ago

            I wasn't super informed on the Apple silicon laptops, so I was kind of disappointed when my last job gave me a 2-3 year old M1 Max laptop.

            It blew the doors off every other laptop and desktop I've had (before or since).

            When I think back to how quickly obsolete things became when I was younger (ie 90s and 00s), today's devices seem to last forever.

        • Marsymars 2 days ago

          For me it was the memory limits more than CPU speed. Discord, Slack, Teams and a browser, and 16gb on my M1 was basically used up.

          • pylotlight 2 days ago

            And here I am struggling with the 32gb version, always need more :P

  • Aurornis a day ago

    > Apple will deign to make their ridiculously overpowered SOCs usable for general purpose computing

    Did everyone forget that these chips started in general purpose MacBooks and were later put in the iPad?

    If general purpose computing is the goal you can get a cheap Mac Mini

  • bapak 2 days ago

    > I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower

    Look at glassy UIs. Worth it.

  • adrr a day ago

    Emulators, because iPadOS doesn’t allow JIT (dynamic recompilation), so you need as much raw CPU as possible.

  • socalgal2 2 days ago

    AFAICT, lots of "AI" related stuff runs slow on M1,M2,M3,M4

    I don't know if this already exists but it would be nice to see these added to benchmarks. Maybe it's possible to get Apple devices to do stable diffusion and related tech faster; it just needs some incentive (winning benchmarks) for people to spend some effort. Otherwise, though, my Apple Silicon is way slower than my consumer-level Nvidia silicon.

  • reaperducer 2 days ago

    I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

    It's a nice problem to have, since for most of computing history it's been the other way around. (Meaning the hardware was the constraint, not the OS.)

    • jchw 2 days ago

      I disagree. For a lot of the personal computing era, the problems with OSes and hardware were mostly a matter of technical progress. The problem with iPadOS is totally different; it's a problem that was basically manufactured in and of itself, and completely artificial at that. I do not think this is a good problem to have at all.

      • dangus a day ago

        I don’t think you’re representing the state of iPad accurately.

        In iPadOS 26, more extensive, Mac-like multi-window multitasking was added.

        The quantity of windows you can keep open at once depends on your iPad’s SoC.

        If you have a newer Pro iPad with more memory, you can keep more of them open before slowdown sets in.

        The hardware is being pushed and used.

        As another example, the iPad has pretty intensive legitimate professional content creation apps now (Logic Pro, Final Cut Pro, and others).

        So there are people out there pushing the hardware, although I’ll join those who will say that it’s not for me specifically.

        • jchw a day ago

          I don't suggest the problem is that the hardware can't be "pushed and used", the problem is that the hardware is being artificially limited by Apple for some unknowable reason. (Well, the reason is knowable, but I'm sure some would dispute it anyways. It's very clearly an extreme defense of their 30% cut on all software that runs on iOS devices.) This is not in question; it doesn't matter what people are doing with iPads, it really is happening. A good example: the first iPad with hardware virtualization support in its CPU could initially run VMs provided you had a jailbreak, but then Apple entirely removed the virtualization framework from the following iPadOS update.

          There is no particular reason a general purpose computer should be "not for me specifically" in terms of what you can do in software. In terms of design, sure. But not in terms of what you can do in software.

          (I have a suspicion the same reason is responsible for why you basically don't find open source software on iOS devices the way you would on even Android or Windows; it doesn't make any money to take a cut out of.)

    • dijit 2 days ago

      I suppose that's an interesting way of framing it - yet in my gut I feel like I am paying for something that I am locked away from.

      Sometimes, though, YouTube will make the iPad uncomfortably hot and consume the battery at an insane pace.

      So, I guess there's someone using the performance.

    • sib 2 days ago

      >> I own an M4 iPad Pro and can't figure out what to do with even a fraction of the horsepower, given iPadOS's limitations.

      > It's a nice problem to have, since for most of computing history it's been the other way around. (Meaning the hardware was the constraint, not the OS.)

      For anyone who works with (full-size) image or video processing, the hardware is still the constraint... Things like high-ISO noise reduction are a 20-second process for a single image.

      I would be happy to have a laptop that was 10x as fast as my MacBook Pro.

    • bigyabai 2 days ago

      I don't think hardware has been a real constraint since the Pentium era. We've been living in a world of CPU surplus for close to 2 and a half decades, now.

      • dylan604 2 days ago

        I've been RAM limited more than CPU limited for some time. In my personal workflows, 32GB was not enough and I'd receive out of memory errors. I bumped that up to 64GB and the memory errors went away. This was in a Hackintosh so RAM upgrades were possible. I've never tried an M* series chip to see how it would behave with the same workflow with the lower RAM available in affordable machines.

  • ajross 2 days ago

    > can't figure out what to do with even a fraction of the horsepower

    That's sort of the funny thing here. Apple's situation is almost the perfect inverse of Intel's. Intel fell completely off the wagon[1], but they did so at exactly the moment where the arc of innovation hit a wall and could do the least damage. They're merely bad, but are still selling plenty of chips and their devices work... just fine!

    Apple, on the other hand, launched a shocking, world-beating product line that destroys its competition in basically all measurable ways into a market that... just doesn't care that much anymore. All the stuff we want to spend transistors on moved into the cloud. Games live on GPUs and not unified SOCs. A handful of AI nerds does not much of a market make.

    And iOS... I mean, as mentioned, what are you even going to do with all that? Even the comparatively-very-disappointing Pixel 10 (I haven't even upgraded my 9!) is still a totally great all-day phone with great features.

    [1] As of right now, unless 18A rides in to save them, Intel's best process is almost five YEARS behind the industry leader's.

    • tyleo 2 days ago

      It’s surprising to me MacBooks have such low market share. I got my first Mac after using Windows all my life and I’m stunned. The laptop:

      1. Lasts all day on battery
      2. Doesn’t get hot
      3. Compiles code twice as fast as my new Windows desktop

      I really don’t like macOS but I’ve shifted to recommending Mac to all my friends and family given the battery, portability, and speed.

      • nhod 2 days ago

        It definitely depends on what circles you run in. When someone I know or is a degree of separation away from me pulls out a PC, it is always a little bit of a surprise.

      • jyap 2 days ago

        Regarding market share and your friends and family recommendations, you’re thinking first world. Rest of the world wants and can only afford sub-$500 laptops.

        • criddell 2 days ago

          I’ve found that the $1000 Mac laptop is worth about $500 after 3 years and the $500 laptop is worth $50. The price difference over time really isn’t that big and the Mac is going to have a better trackpad and display and longer battery life.

          • Marsymars 2 days ago

            Yeah but in the longer term the price trends to $0 either way, and Windows will get software support for longer.

            My mom is happily using a Lenovo from 2013 and looking to upgrade because it doesn't support Windows 11 and Win10 is running out of support. A contemporary Mac would have been the 2012 Mac Mini which would have received its final OS update with 10.15 Catalina in 2019, and would have received its final security update in 2022. (Desktop, so no difference in peripherals, etc.)

            Incidentally, I actually purchased both the Lenovo and a 2012 Mac Mini (refurb) so I have the pricing data - the Lenovo was $540 and the Mac Mini was $730 - and then both took aftermarket memory and SSD upgrades.

          • whatever1 2 days ago

            If your $1000 MacBook breaks after a year you need $1000 to repair it.

            A $500 laptop is probably more repairable, and worst case you pay $500 to get a new one. Not to mention battery replacement etc.

            The expected total cost of ownership is very high for a Mac. It’s like owning a Mercedes. Maybe you can afford to buy one, but you cannot afford maintenance.

            • sgarland a day ago

              As a sibling comment said, what maintenance? The only problem I’ve ever had with any Mac was a bad keyboard on my M4 MBP, and that showed itself so quickly that even without AppleCare it would have been covered.

              Between work and personal, I’ve had an Intel Air, 2x Intel Pros, M1 Air, 2x M3 Pros, and an M4 Pro. My wife has an M1 Air. My in-laws have an M3 iMac. My mom has… some version of an Apple Silicon laptop.

              That is a decent amount of computers stretching over many years. The only maintenance required has been the aforementioned keyboard.

              • ajross 13 hours ago

                Oh, come on. Laptops are mobile devices that live in bags and backpacks and they break all the time. I've had more laptop failures than cracked phones, even. You absolutely need an answer for "what happens if my screen gets cracked", just ask any college student. Windows junk is cheaper, it just is.

                In pre-college education, the answer is often "use any other junky Chromebook from anywhere in the world", which is cheaper still.

                • sgarland 2 hours ago

                  Maybe you need a better backpack? I’ve had zero cracked screens. I’ve also never cracked my phone screen, though, so there’s that.

                  I did drop my watch last week, and the second hand fell off, though.

            • culopatin a day ago

              What maintenance? AppleCare also exists if you worry about such things.

          • serf 2 days ago

            Larger initial purchases are harder on lower-income earners regardless of the long-term value they offer; that's one of the hard parts about being poor: it also makes positive economic decisions harder to accomplish.

          • dangus a day ago

            That just means that the not-Mac is way more accessible. The high resale value makes Macs more expensive overall for everybody.

            Also a lot of people prefer windows. It’s got a lot more applications than Mac. It has way more historical enterprise support and management capability as well. If you had a Mac at a big company 20 years ago the IT tooling was trash compared to windows. It’s probably still inferior to this day.

            • criddell a day ago

              > It’s got a lot more applications than Mac.

              The Mac can (legally) run more software than any other computer. Aside from macOS software, there's a bunch of iOS and iPadOS software that you can run, and you can run Windows, Linux, and Android software via VMs.
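
              For the VM case, Apple even ships a first-party Virtualization framework. A rough sketch of booting a Linux guest (the kernel path is a placeholder; a real setup also needs a console device, a ramdisk, and the virtualization entitlement):

                  import Virtualization

                  // Sketch: configure and start a Linux VM with Apple's Virtualization framework.
                  let config = VZVirtualMachineConfiguration()
                  config.cpuCount = 4
                  config.memorySize = 4 * 1024 * 1024 * 1024 // 4 GiB

                  let bootLoader = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/vmlinuz"))
                  bootLoader.commandLine = "console=hvc0"
                  config.bootLoader = bootLoader

                  try config.validate()
                  let vm = VZVirtualMachine(configuration: config)
                  vm.start { result in
                      print("VM start:", result)
                  }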

              • dangus a day ago

                Yeah…I don’t think so. Moving the goalposts to include Parallels/VMs and iOS/iPadOS apps that lack a touch screen on the Mac and are frequently blocked from running on the Mac by developers doesn’t count.

                Let’s not forget that you’re now talking about buying a $100/year license; in just a few years you could buy a whole Windows computer with a permanent license for that money.

                And if you’re going to talk about how great VMs are on Mac we can’t leave out how it’s the worst Docker/podman platform available.

      • ajross 2 days ago

        I won't buy or recommend one just on principle. I've spent way too much of my life advocating for open firmware and user-customizable systems to throw it all in the trash for a few hours of battery. I absolutely tell everyone they're the best, and why, but my daily driver has been a Linux box of some form (OK fine I have a windows rig for gaming too) for decades, and that's not changing.

        Also, again, most folks just don't care. And of the remainder:

        > Compiles code twice as fast as my new Windows desktop

        That's because MS's filesystem layer has been garbage since NT was launched decades ago and they've never managed to catch up. Also, if you're not comparing apples to apples and are measuring native C/C++ builds: VS is an OK optimizer but lags clang badly in build speed. The actual CPU is faster, but not by nearly 2x.
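
        The filesystem point is measurable: builds are metadata-heavy (thousands of stats and opens for headers), so a toy microbenchmark like the sketch below (file names are arbitrary) tends to diverge across OSes far more than raw CPU scores do.

            import Foundation

            // Time 10,000 existence checks: a stand-in for the metadata-heavy
            // access pattern of a compiler scanning header files.
            let dir = FileManager.default.temporaryDirectory
            let files = (0..<10_000).map { dir.appendingPathComponent("hdr\($0).h") }
            for f in files { try? Data().write(to: f) }

            let start = Date()
            var found = 0
            for f in files where FileManager.default.fileExists(atPath: f.path) {
                found += 1
            }
            print("checked \(found) files in \(Date().timeIntervalSince(start)) s")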

        • kstrauser 2 days ago

          >> Compiles code twice as fast as my new Windows desktop

          >That's because MS's filesystem layer has been garbage since NT was launched decades ago [...]

          I confess that this kind of excuse drives me batty. End users don't buy CPUs or filesystems. They buy entire systems. "Well, it's not really that much faster, it's just that part of the system is junk. The rest is comparable!" That may be, but the end result for the person you're talking to is that their Windows PC compiles code at half the speed of their Mac. It's not like they bought it and selected the glacial filesystem, or even had a choice in the matter.

          That's right up there with "my Intel integrated graphics gets lower FPS than my Nvidia card." "But the CPU is faster!" Possibly true, but still totally irrelevant if the rest of the system can't keep up.

          • aleph_minus_one 2 days ago

            > End users don't buy CPUs and buy filesystems. They buy entire systems. [...] Possibly true, but still totally irrelevant if the rest of the system can't keep up.

            At least historically for hardware components of PCs, this was not irrelevant, but the state of things:

            You basically bought some PC as a starting basis. Because of the high speed of improvements, everybody knew that you would soon replace parts as you deemed feasible. If some component was not suitable anymore, you swapped it (upgrading the PC). You bought a new PC only when things got insanely outdated and updating was not worth the money anymore. With this new PC, the cycle of updating components started again from the beginning.

            • kstrauser 2 days ago

              But that still doesn't explain away "oh, it's only slow because the filesystem is so slow". Assuming that's true, that's a very integral part of the system that can't readily be swapped out by most people. You can't say "the system is actually really fast, it's just the OS that's slow", because the end result is just plain "the system is slow."

              • aleph_minus_one 2 days ago

                > You can't say "the system is actually really fast, it's just the OS that's slow", because the end result is just plain "the system is slow."

                If performance is so critical, people do find ways around this. Just to give an arbitrary example since you mention file systems:

                Oracle implemented their own filesystem (ASM Cluster File System (ACFS)):

                > https://en.wikipedia.org/w/index.php?title=Oracle_Cloud_File...

                "ACFS provides direct I/O for Oracle database I/O workloads. ACFS implements indirect I/O however for general purpose files that typically perform small I/O for better response time."

          • ajross a day ago

            > I confess that this kind of excuse drives me batty.

            The use case was "compiling code". My assumption was that anyone buying hardware for that application would understand filesystem performance and system tuning, plus stuff like "how to use a ramdisk" or "how to install Linux".

            Yes, if you want to flame about the whole system experience: Apple's is the best, period. But not because they're twice as fast, that's ridiculous.

ed_blackburn a day ago

I keep asking (surely I'm not the only potential customer): where's my ARM64 Linux Mac mini equivalent? I'd settle for a laptop form factor if I must.

ktosobcy a day ago

I like the rise of ARM… first Apple, and then the "PC world" followed suit with Snapdragon. With the latest Elite, I think my next machine (after my MBP M1, which is still a perfectly fine machine in 2025) will be a PC laptop with a ~Snapdragon (or something similar) running Linux... I'm fed up with Apple's more and more stupid changes and limitations :/ (I actually disabled automatic updates to avoid LiquidTahoe!)

For me a cool and silent device is more important than raw power, and we're at a point where the CPU is not that important (memory more so).

  • torginus a day ago

    Dunno, I remember reading an article about one of the first Snapdragon PCs. They concluded that while the CPU side was pretty good (including, surprisingly, the availability of native ARM ports and x86 emulation), the GPU was far behind typical PC offerings in sophistication: it was unable to run the complex, compute-shader-heavy workloads required by modern games, shader compilers were buggy, and advanced features were missing or poorly implemented, which made it an expensive word-processing machine.

    I'm not sure if they've fixed their newer GPUs, but they really ought to.

    • kakacik a day ago

      You get circa the same graphics as top Android phones, which is pretty decent. Of course it won't compete with a 5090, but who would expect that?

      • torginus a day ago

        That's the whole point - these GPUs have impressive paper scores, but are designed with traditional, outdated ideas of shader pipelines in mind, and either cannot support flexible modern shading models/compute, or those features have tons of bugs, or end up being much slower than the paper specs would suggest.

        Essentially, they can't run most PC games or graphics software made in the past decade.

  • chk84us a day ago

    > I actually disabled automatic updates to avoid LiquidTahoe

    Is there even a way to automatically upgrade to a major macOS version? Automatic update settings are only for minor/patch and security responses.

    • wpm a day ago

      Yeah but even for device administrators using bleeding edge “Declarative Device Management” the “your Mac must be on this version by this time” thing is really hit or miss. I was testing it yesterday just telling a test computer to update to 15.7.1 and while I got nagged to hell, it never actually rebooted to start the update. This is despite claims in each of the nag messages that it would.

      At least Apple is still somewhat deferential to the user experience.

    • ktosobcy a day ago

      Hmm... that may be true. I just disabled everything in preferences to avoid it completely...

  • heavyset_go a day ago

    > will be PC laptop with ~Snapdragon (or something similar) with Linux...

    Before you do that, really research the exact model you plan on buying and how much support it truly has in Linux. Qualcomm has not exactly lived up to its own hyped intentions of good Linux support, and that provides a foundation of sand for hardware vendors to further drop the ball on when it comes to Linux support.

    Some models built on the Elite platform reportedly have great support, while others don't. My experience with Linux on ARM SoCs, outside of servers, has made me very wary about using consumer products for desktop Linux. You really don't want to be dependent on the vendor, or some random people working in their free time, for custom kernel forks or Linux images, and that seems to be the trend in the non-server ARM boards + Linux ecosystem.

    • mort96 a day ago

      100%. A huge issue with the ARM ecosystem is that there's no widely adopted UEFI-like standard, so there's no runtime discovery of hardware configuration, and things like the boot process and partitioning are extremely different between machines. You pretty much need a Linux image built specifically for the particular hardware, unlike in the x86 world, where you just need a Linux image that can boot using UEFI and discover all relevant hardware through UEFI, ACPI, and PCI enumeration.

      • ktosobcy a day ago

        Are there any plans for a similar solution?

        Is this the reason why the Android ecosystem is so abysmal when it comes to hardware driver support?

        • heavyset_go 21 hours ago

          The ARM server ecosystem has it solved with SBSA, which includes ACPI, UEFI, etc as part of the standard.

          There are some ARM consumer computers that implement UEFI, etc, though.

          The issue is that it's cheaper to just put together bespoke boards and not implement industry standards that are expected on PCs. Vendors don't have to worry about supporting anything other than the hardware and images they ship, so they don't see a reason to make third party OS support viable.

          > Is this the reason, why the Android echosystem is so abysmal when it comes to hardware driver support?

          Theoretically, Android should have all of the same drivers as Linux, along with Android-specific drivers that are abstracted over a stable driver interface instead of the unstable Linux one.

          What usually happens is the plethora of drivers that vanilla Linux ships with aren't built for, and distributed with, phones/tablets, and the Android drivers only have to support that specific board's hardware. Then, over time, the kernel doesn't get updated so new hardware won't be supported the longer you use the device.

          If you're asking why Android devices have poor support for Linux, the OP's answer is the exact reason why.

    • ktosobcy a day ago

      Thanks for the heads up. In general, Linux hardware support was hit-or-miss for a long time, so I kind of expected that. On the other hand, I said that in the hope that things will improve significantly by the time I upgrade the machine (which is still years away; I stayed with my previous MBP for 8 years, so I still have like 3-5 years to go :D)

      • heavyset_go 21 hours ago

        These days x86 PCs are well supported on Linux; the kinds of problems you can encounter with ARM boards are unique and just don't exist on PCs, for a variety of reasons.

        • ktosobcy 20 hours ago

          Well, I haven't had experience with Linux on real hardware for quite a long time, but I'm aware the situation has improved significantly, to the point where it's mostly a non-issue (save for GPU drivers, ahem), so my hope is the same will happen with ARM.

  • podlp a day ago

    Are there viable options? I run Asahi on my M1 Air and it performs quite well, but certain software like Android Studio has issues and hiccups. I’d love a decent ultrabook to replace my Air, but I hate Windows and got overwhelmed with reading Reddit forums to know which PCs will and won’t work with Linux. I don’t mind doing some setup, but I’d rather not be disappointed by non-functional or missing drivers for WiFi, Bluetooth, display, etc.

modeless 2 days ago

Intel's best single core score is listed as 3240 for the Core i9-14900KS, a 250W desktop monster chip. Is this score of 4133 for an equivalent test? Is Intel that far behind?

  • RachelF 2 days ago

    Yes. AMD is a little better.

    But bear in mind that Geekbench runs very short benchmarks for single core especially, so that the CPU never starts thermal throttling.

    The Apple chips are fast and efficient, and "feel" even faster because of their single-core burst performance and very fast on-package RAM.

    • satellitemx a day ago

      Cinebench 2024 takes considerably longer to finish and saturates whatever thermal headroom the device has, so it can be treated as an accurate reading of sustained performance.

      Chart below is the aggregated result from CPU-monkey, Geekerwan's chip analysis, devices of my own and various other reports.

      Apple M1 series, 3.2GHz, 5W: ~115
      Apple M2 series, 3.5GHz, 5W: ~120
      Apple M3 series, 4GHz, 7.5W: ~140
      Apple M4 series, 4.4GHz, 7.5W: ~170

      Snapdragon X1E, 4.3GHz, 10-20W: ~140
      Snapdragon X2E, 5GHz, >20W: ~160

      AMD 9950X, 5.7GHz (Desktop Best): ~140
      AMD AI 9 HX 375, 5.1GHz (Laptop Best): ~125

      Intel Ultra 9 285K, 5.7GHz (Desktop Best): ~145
      Intel Ultra 9 285HX, 5.4GHz (Laptop Best): ~135
      Intel Ultra 9 288V, 5.1GHz (Low Power Laptop Best): ~125

      Apple M5 may be the first ever CPU to near ~200pts on Cinebench single-core while still maintaining less than 10W of core power draw. Competitors lose on all fronts by about 2 or even 3 generations at their respective device class.

      • seec a day ago

        Using the Cinebench single-core score to compare desktop chips is very dishonest. It's an extremely parallel benchmark, where many powerful cores get you a much better score.

        For example, the M4 does get around 170 single, but the Snapdragon X2E gets just under 2000 for multi, over double what the M4 scores. If your application is relevant to Cinebench, the X2E is a better CPU for that. To match the X2E you need to go up to the 16 cores M4 Max.

        The 16-core variant of the M4 Max is only available in a 16-inch MBP that starts at €4.8K; it's not clear how much the X2E laptops will cost, but I would bet a lot of money that it's going to be much less than that...

        As for desktop parts, the only Apple product that beats the typical x86 CPUs in multi is actually the M3 Ultra, which is a pretty bad deal because it doesn't scale very well in other ways (GPU). Otherwise, Intel (i9s/Core Ultra 9) and AMD (Ryzen 9) still hold the crowns in pure multicore score.

        The score of a 16-core M4 Max actually puts you down in Core Ultra 7 265K territory. You can put together a system based around that CPU and a 5070 Ti GPU (which raw-benches around the same as the 40-core M4 Max variant but will actually perform much better for most things) for a full €1200 less than a Mac Studio equipped like that (it even has the luxury of double the storage). If you don't need dedicated GPU power, or could do with a low-end GPU, the savings would be between €1700-2000 (the 5070 Ti is an €800 part).

        Let's be real, the Apple Silicon SoCs are very power efficient but they are definitely not performance maxing and they are even less money efficient. It is very suspicious arguing about top performance while ignoring multicore.

        Now here is another fact: the 16-core M4 Max can draw more power than the 140W power adapter included with the 16-inch MBP. It has a battery capacity of 100Wh. If you run things at full tilt or near it, it will actually have a runtime of less than an hour. It's actually funny, because the Apple aficionados keep singing the praises of Apple Silicon and many have been burned by that unexpected fact. It's easy to shit on the high-power gamer-type laptop that can't run well on battery, but that's actually true as well if you use the full power of an Apple Silicon laptop. You might get like half an hour more runtime, but that's basically irrelevant.

        The reality is that everyone singing the Apple Silicon efficiency praise doesn't have truly demanding workloads; otherwise the battery life wouldn't be a meaningful difference.

        High performance laptops don't make a lot of sense whether they are Apples or other brands. They are mostly convenience products (for docking/undocking) or status symbols.

        • satellitemx a day ago

          It’s to compare core architecture, brother. The fact is Apple’s core has the best IPC, the best efficiency, and the best peak performance.

          And you put out a long, long post to point out what everyone understands: putting more cores in and running them at a lower frequency yields better efficiency at full load… that’s why we’ve got to the point where, in today’s x86 laptops, a single core running at full speed already exceeds the sustained multicore power target (28-60W depending on device class), because Intel and AMD have no other way to up the performance than adding more cores.

  • BearOso a day ago

    It's all up in the air when you have completely different environments. These benchmarks are mostly useful for comparing similar chips from generation to generation.

alberth 2 days ago

Looks like an improvement over the M4 iPad of:

  Single Core: ~12%   (3679 vs 4133)
  Multi Core:  ~15%   (13420 vs 15437)
Which is in line with the historical gains from a new process node.

https://browser.geekbench.com/ios_devices/ipad-pro-13-inch-m...

  • pier25 2 days ago

    > since M5 will be on the new (smaller) 2nm node

    AFAIK the M5 is still 3nm, produced on the TSMC N3P node.

    • alberth 2 days ago

      Thanks, I updated my wording as such.

    • red369 2 days ago

      I frequently forget that the 2 nm naming has nothing to do with the physical size anymore. I hate that naming system.

      From Wikipedia:

        Node name  Gate pitch  Metal pitch  Year
        5 nm       51 nm       30 nm        2020
        3 nm       48 nm       24 nm        2022
        2 nm       45 nm       20 nm        2025
        1 nm       40 nm       16 nm        2027

      • pier25 2 days ago

        I'm no expert but I think it's more related to the resolution of the photolithography process.

ksec 2 days ago

iPad M5 vs M4 [1]; this is coming from the leaked unboxing video of the M5 iPad Pro, so it should be legit.

  Single-Core Score: 4133 vs 3748 (110.3%)
  Multi-Core Score:  15437 vs 13324 (115.9%)

Same maximum clock speed, so assuming no special thermal solution on the new iPad Pro (such as a vapour chamber), this is a ~10% pure IPC improvement, although the M5 has 6MB of L2 cache, 2MB more than the M4.
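
As a quick sketch of that arithmetic: single-core performance ≈ IPC × clock, so with equal maximum clocks the score ratio approximates the IPC ratio.

    import Foundation

    // Back-of-the-envelope from the leaked scores: clocks are equal,
    // so the single-core ratio approximates the IPC ratio.
    let m5 = 4133.0, m4 = 3748.0
    print(String(format: "~%.1f%% IPC uplift", (m5 / m4 - 1) * 100)) // ~10.3%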

Not shown here is the E-core performance which, if we trust the A19 Pro tests, is 20 to 30% higher than previous generations. The GPU is also a lot faster on the A19 Pro.

The M5 also comes with 12GB of memory as baseline, 4GB more than the M4 you get on the iPad Pro. I hope the M5 MacBook Air continues to get 16GB as baseline; it's looking like a very decent upgrade for anyone on M1 or still on the Intel platform. It would be perfect if the MacBook Air got a vapour chamber like the iPhone Pro. I don't mind paying slightly more for it.

[1] https://browser.geekbench.com/v6/cpu/compare/14173685?baseli...

  • BolexNOLA 2 days ago

    Frankly it’s borderline unethical that they offered 8gb models past like…M1. 12gb isn’t enough IMO for anyone who wants a machine that will last but at least it’s a step in the right direction.

    My friend with an M3 MacBook was complaining about the speed. I told them that was ridiculous and they must be doing something incredibly intensive. I came and took a look at it - I know chrome tabs are a memory hog but my God this thing slowed to a crawl with even lightweight usage despite being weeks old. I told him to return it immediately.

    • astrange 2 days ago

      It should not have been noticeably slow no matter what you did. It'd run out of memory with enough tabs of course.

      The fastest way to get more memory is an ad blocker.

    • acdha 2 days ago

      > I know chrome tabs are a memory hog but my God this thing slowed to a crawl with even lightweight usage despite being weeks old. I told him to return it immediately.

      Wouldn’t it be easier to use Firefox or Safari? Chrome is a hog but it’s not like we don’t have multiple great alternatives which also use something like half of the battery and don’t oppose privacy measures.

      • BolexNOLA 2 days ago

        Not my computer, I’m not going to say “don’t like your computer? Change browsers.” They returned it, got a little more ram, and they’re happy now. I don’t know what their use case is that in-depth so I’m not going to tell them to completely change their browsing habits and usage unless they are open to that. $200 later they were happy.

        Given the field they are in, I imagine they use Chrome, like many do, for compatibility/testing reasons.

    • ksec 2 days ago

      Chrome, for at least the past 2 years, has drastically improved multi-tab memory usage. It also now unloads unused tabs by default, just like Firefox. While it still isn't as good as Firefox in 100+ tab scenarios, Chrome is not too far off, unlike Safari, which as of version 26 still doesn't care much about multi-tab usage.

    • jeffbee 2 days ago

      I am totally satisfied with the utility of my Apple Silicon Mac mini that is now 5 years old. Calling it "unethical" shows you have a weird point of view out of touch with mainstream use cases.

      • zf00002 2 days ago

        Reading this on my 8GB M2 Air, which I have not once ever felt is lacking.

        • BolexNOLA 2 days ago

          Glad you’ve had a good experience; maybe we just got a bad computer. But ultimately I know what I saw, and it was a brand new M3 machine running like crap.

          • sys_64738 2 days ago

            Try installing the open-source Vitals.app to see what's going on:

            https://github.com/hmarr/vitals

            Stats is another good one too:

            https://github.com/exelban/stats

            • saagarjha a day ago

              Why not just use Activity Monitor

              • BolexNOLA 20 hours ago

                Honestly, we combed it over and it didn’t tell us anything other than high memory utilization/pressure. I’m not sure what the deal is, but I’ve found AM on Macs almost…deceptive? It feels like I’m not getting the full story sometimes. We use one at my office for live streams and the math on resources often does not check out. Our head of engineering agrees that in his experience the silicon chips do not like to let go of tasks/processes and can be a little opaque about what they are doing in the background.

                I know it sounds a lot like vibes, because it kind of is, but it’s just what we’ve seen.

                • saagarjha 18 hours ago

                  Why are the tools you linked better, though? They mostly use the same APIs.

                  • BolexNOLA 15 hours ago

                    I think you’re mixing me up with sys. I didn’t link anything

      • BolexNOLA 2 days ago

        No need for a personal attack dude.

zamadatix 2 days ago

Lines up almost exactly with the improvements in the A19 Pro, i.e. ~+10% single and ~+15% multi.

  • runjake 2 days ago

    And the GPU is up by ~30%! This falls in line with the recent jumps in the A chips as well.

zmmmmm a day ago

The real question for me is how much they target GPU support towards accelerating language and vision models in the MacBook Pro lineup. I don't know if Apple actually cares, but they've got a huge opportunity to steal the spotlight from Nvidia if they make inference competitive with Nvidia chips.

  • YmiYugy a day ago

    Nvidia's spotlight comes from $30k datacenter GPUs and network equipment. So unless Apple starts making those, nobody but a few privacy conscious enthusiasts will care about running some mediocre models at glacial speeds on their MacBook.

    • zmmmmm a day ago

      sure - that's the data center model and nVidia will probably always own that.

      But I disagree that only privacy-conscious enthusiasts will want to run locally in the end. Right now, amid the hype froth and bubble, while SOTA advances fast enough that getting the very latest hosted model is super compelling, nobody cares. Longer term, especially once hosted services start deciding to try and make real money (read: ads, tracking, data mining, etc.), this is going to change a lot. If you can get close to the same performance locally without data security issues, ads, or expensive subscriptions, I think it will be very popular. And Apple is almost uniquely positioned to exploit this once the pendulum swings back that way.

ZuLuuuuuu a day ago

Looks like M5 MacBooks will still have an edge over PC equivalents, but I am glad that with the new Qualcomm CPUs, PCs are at least getting close. Unfortunately, Intel and AMD are falling so far behind that they cannot compete in the laptop form factor anymore. The best Intel and AMD laptop CPUs, regardless of their TDP, still score around 3000 on Geekbench single-core.

stakhanov a day ago

Is the Mac Pro pretty much no longer a thing, going forward? -- Not trying to be a smartass, just asking out of genuine curiosity, because I know next to nothing about the Apple lineup. But the naming ("M2" vs "M5") would seem to suggest it's 3 generations behind the latest?

  • wlesieutre a day ago

    Yes, it mostly exists for people who absolutely require PCI slots, except you can’t use AMD/Nvidia GPUs anymore so the utility of that is limited.

    Apple has their “Afterburner” card for accelerating ProRes media, you could add even more ports, or there are probably weird AV interface cards, but the vast majority of people can save a few thousand dollars and get a Mac Studio instead.

    Since the Mac Studio has more than 10 customers it gets updated more frequently.

    There were rumors about the Mac Pro getting a higher tier of “we stuck twice as many cores together in the SoC” but it didn’t pan out, likely not worth the development time compared to the higher volume products. But it could hypothetically still happen.

  • HSO a day ago

    the next mac pro (presumably next mar/apr) is the first that comes a full 3-year product cycle after ai hype started.

    therefore I expect that mac pro (and in similar vein mac studio) will be repositioned as ai/ml dev machine, with apple leaning into their lucky strike of UMA fit with modern requirements.

    my bet is m5 extreme exclusive to mac pro and 1 tb possibly even 2 tb ram, and mac studio limited to m5 ultra and 1 tb ram on the high ends.

    but that's not based on rumors or "news" of any sort, just logic extrapolated as if i were in apple's shoes

criddell 2 days ago

I bought an M4 iPad Pro and ended up returning it because I just don't like the magic keyboard. My current iPad is from 2018 and I use the Smart Keyboard Folio and (for me) it's just about perfect. Small, lightweight, not too expensive, easy to clean, and works great.

I've been hoping there were enough people like me that a third party would make a replacement but that never happened.

I know my current iPad Pro won't last forever so I suppose I'll end up with a Magic Keyboard setup eventually.

  • wintermutestwin a day ago

    I just don't understand the iPad + Magic Keyboard. Combined, they weigh as much as a macbook, which has a much better pad+KB and an OS that isn't crippled.

  • tylerflick 2 days ago

    Still rocking a 2018 pro as well. At this point I would pay Apple for OS upgrades as I don’t see any reason to buy a new model.

    • sandbags 2 days ago

      Likewise. It still works fine for all the things I use it for. Long may it last!

  • bogdart 2 days ago

    Just don't update it to iOS 26; for me it became unusable.

    • criddell a day ago

      I already did and it seems fine to me. I don't really notice much difference.

razighter777 2 days ago

Shame this fantastic, carefully engineered and unique technology is locked behind closed-source drivers, non-upgradable hardware, and tamper-resistant boot processes. I love Apple silicon... if only they would launch some sort of product or documentation allowing M-series processors to be practical for more general-purpose computing...

  • CharlesW 2 days ago

    >, if only they would launch some sort of product or documentation allowing M-series processors to be practical for more general purpose computing...

    Apple isn't the only company that can do this, but they'll continue to have the lead for the foreseeable future for all the reasons you dislike them. This is the benefit of near-complete vertical integration.

  • netule 2 days ago

    I love being able to run local LLMs with decent performance on a laptop without external hardware, but it would be really nice if there was better gaming support.

  • viktorcode 2 days ago

    > and tamper resistent boot processes

    Bootloader is unlocked on Macs. That's how Asahi Linux started

amelius 2 days ago

To what extent can these improvements be ascribed to Apple, as opposed to the particular silicon fab process node?

  • nchmy 2 days ago

    I'm also curious about this. And, even more so: did Apple just start designing chips from scratch, or did they buy someone who already had some cutting-edge technology? It's hard to believe that they're beating everyone just out of nowhere...

    • dagmx 2 days ago

      It’s not out of nowhere. It’s just that people didn’t pay attention to their mobile chips.

      But AnandTech had articles as far back as the A12, 7 years ago, where it was competing with the Intel chips of the era.

      A secondhand link because anandtech is restructured now unfortunately https://appleinsider.com/articles/18/10/05/apples-a12-bionic...

      • astrange 2 days ago

        > because anandtech is restructured now unfortunately

        …because he left to work on the chips.

        • dagmx a day ago

          The changes to the site are very recent, and they continued doing stellar reports after he left too.

        • amelius a day ago

          and silenced by Apple, I guess.

    • criddell 2 days ago

      Right after the launch of the original iPhone, Apple bought P.A. Semi for $278 million to work on chips for future iPhones.

      • kridsdale1 2 days ago

        A lot of those guys had a BBQ at my apartment complex for a celebration hosted by my roommate, who worked at Apple’s Silicon Team with them. Lots of cool things to talk to them about. This was around the time of the iPhone 7.

        A huge number of them were Iranian.

    • robertjpayne a day ago

      This is the payoff of over a decade of R&D poured into their chip development.

      Apple is a vertical-integration company, and relying on external vendors for the CPU/GPU in their devices was a clear sore spot they did not like.

      You can see their lead continuing and growing with things like their new modem used in some iPhones

Zak 2 days ago

I'm not especially impressed that Apple came up with a mobile CPU that pretty much doubles the performance of the three-year-old Ryzen 6850U in my ThinkPad. What I'm impressed with is that it's doing that in an iPad, which presumably doesn't have a fan.

megamix a day ago

Emphasis on "leaked". Or is this just a convenient word to use? How do these things leak? Thanks ! :)

michelb a day ago

I'm excited to see what the newly added matmul accelerators will do for the pro versions of these chips.

commandersaki 2 days ago

Anyone thinking Apple October event? New iPad, budget Macbook?

nullbyte 2 days ago

I am very curious how GPU performance will be, especially for AI tasks

grigio 2 days ago

if only it could run Linux..

  • urbandw311er 16 hours ago

    I mean, it kind of does under the hood, right? Isn’t macOS, and by extension iPadOS, a sort of relative of Unix?

ceayo 2 days ago

Why would it have 9 cores? It feels a bit weird for a processor to have a core count that is (a) odd and (b) not a power of two.

  • zamadatix 2 days ago

    Some models of the M4 iPad had 3P + 6E = 9 cores as well, so it's certainly not unusual. Like another commenter said, the "why" can be binning for chips which come out with a broken P core.
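
    On a Mac from the same chip family you can read the cluster split directly. A small sketch using the per-cluster sysctls (these keys exist on Apple Silicon macOS; they're absent on Intel Macs):

        import Foundation

        // hw.perflevel0 = performance cluster, hw.perflevel1 = efficiency cluster.
        func sysctlInt32(_ name: String) -> Int32 {
            var value: Int32 = 0
            var size = MemoryLayout<Int32>.size
            sysctlbyname(name, &value, &size, nil, 0)
            return value
        }

        print("P cores:", sysctlInt32("hw.perflevel0.physicalcpu"))
        print("E cores:", sysctlInt32("hw.perflevel1.physicalcpu"))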

  • bayindirh 2 days ago

    Having one core dedicated to coordination tasks, and maybe taking all the interrupts while the other 8 are churning, is always a good idea from my perspective (HPC admin and programmer).
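
    You can't pin interrupts from user space on macOS, but QoS classes give a soft version of that split on the asymmetric clusters; a sketch:

        import Dispatch

        // macOS steers work to clusters by QoS: background work tends to land on
        // E cores, leaving P cores free for latency-sensitive tasks.
        DispatchQueue.global(qos: .background).async {
            // bulk churn: likely scheduled on efficiency cores
        }
        DispatchQueue.global(qos: .userInteractive).async {
            // latency-sensitive coordination: likely on performance cores
        }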

    Also Apple is not shy of odd numbered cores. iPad 2 had a tri-core GPU for example.

  • dylan604 2 days ago

    AMD used to have a 3-core desktop CPU. IIRC it was because one of the intended 4 cores was bad, but they could salvage the package by making it 3 cores. The 3-core option was so popular that they kept it once the 4th-core issues were resolved, with the core disabled via a software patch. Clever people figured out how to enable it once installed and got a 4th core for free. Never used it myself, and only heard tales about it. Could be urban legend.

    • mrheosuper a day ago

      It is true. I remember when the process became more mature, almost all 3-core CPUs could be unlocked.

esjeon a day ago

It's about a 10% increase from the M4, which is impressive. But, hey, I don't think I'm even using 10% of the M4's processing power on my iPad. What's the point?

Apple seriously needs to open up development on/for iPadOS, or it'll become something you buy every decade or so. Who needs a new model if people can't even utilize what they have right now?

l5870uoo9y 2 days ago

Unrelated question, where are these cores produced?

caycep a day ago

Is this still TSMC N3P?

qmr 2 days ago

Looking forward to how programmers manage to waste and abuse this power.

  • jp0d 2 days ago

    Care to elaborate?

maz1b 2 days ago

Where's the M4 Ultra/M4 Extreme?

M5 Ultra/M5 Extreme/M5 Super?

  • jyap 2 days ago

    Per the title, this is a leaked benchmark. It’s not an Ars Technica full benchmark article.

  • pier25 2 days ago

    There won't be an M4 Ultra as the chip was not designed with a fusion connector.

    The Extreme versions were only rumors. But maybe Apple will finally make one with the M5 and release a proper Apple Silicon Mac Pro.

    • risho 2 days ago

      >There won't be an M4 Ultra as the chip was not designed with a fusion connector.

      which is the same thing that people said about the m3

      • pier25 2 days ago

        Really? The M3 Ultra still uses the UltraFusion connector.

Luker88 a day ago

"...but does it run Linux?"

aspenmayer 2 days ago

Related, but not yet posted to HN that I know of:

Leaked unboxing video reveals unannounced M5 iPad Pro in full - https://9to5mac.com/2025/09/30/leaked-unboxing-video-reveals...

https://x.com/markgurman/status/1973048229932507518 | https://xcancel.com/markgurman/status/1973048229932507518

Exclusive! Unboxing the iPad Pro with the M5 before Apple! - https://www.youtube.com/watch?v=XnzkC2q-iGI

  • dylan604 2 days ago

    > Exclusive! Unboxing the iPad Pro with the M5 before Apple!

    Big boy is bitching about a meager 10% increase in CPU and 30% increase in GPU as a nothing burger. "Who would upgrade from M4 to M5?" Exactly. The difference comes when you upgrade from something older to the latest; most people do not upgrade annually. I'm looking to replace my 6th-gen tablet, but now I might just get an M4 after the M5 is official and get a nice discount on what will be a helluva upgrade for me.

    Some of the comments in the threads you linked also suggest Russia has infiltrated Apple, but my guess would be somewhere on the Chinese side of the supply chain.

    • larusso 2 days ago

      I also don’t understand what people expect. It seems like everybody is constantly grilling their machines; otherwise I can’t understand this need for even more power. I have the iPhone 16 Pro Max and upgraded from a 13 Pro Max. I didn’t really feel a difference, mainly because I don’t use my phone for high-performance applications or gaming.

      [edit] typo

      • pitched a day ago

        I recently upgraded to a 15 (from an SE) and found that the power efficiency is very noticeable. I can leave home in the morning at 50% and still expect it to last the day. Like you said though, I don’t know if the gain in peak performance is noticeable at all, outside of how it might help power efficiency.

    • thenthenthen a day ago

      The second video posted shows what looks like a Chinese style charger.

itopaloglu83 2 days ago

I can’t wait to suffer M5 with macOS 26, such a disappointment.

Edit: Let me double down, macOS 26 is the worst OS that Apple has shipped in the last two decades.

  • pitched a day ago

    I am very unhappy with Liquid Glass but it’s been extremely stable, as far as I’ve heard so far?

lvl155 2 days ago

Too bad Apple refuses to innovate on AI. Epic management failure.

  • robertjpayne a day ago

    Apple doesn't tend to enter unprofitable markets early. Name one AI company that is innovating and profitable.

    All the premier model providers are losing money hand over fist, even at hundreds of dollars a month in subscription fees.

  • sroussey 2 days ago

    Blame it on the bean counters.

    • ebbi 2 days ago

      That's on Tim Cook. He could have easily overridden that decision if he deemed it important enough.