Asahi is awesome!
But this also proves that laptops outside the MacBook realm still need to improve a lot. I wish there were a Linux machine with the hardware quality of a MacBook.
* x86 chips can surpass the M series CPUs in multithreaded performance, but are still lagging in single-threaded performance and power efficiency
* Qualcomm kinda fumbled the Snapdragon X Elite launch with nonexistent Linux support and shoddy Windows stability, but here's hoping that they "turn over a new leaf" with the X2.
Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
On the build quality side, basically all the PCs are still lagging behind Apple, e.g. yesterday's rant post about the Framework laptop [2] touched on a lot of important points.
Of course, there are the Thinkpads, which are still built decently but are quite expensive. Some of the Chinese laptops like the Honor MagicBooks could be attractive and some reddit threads confirm getting Linux working on them, but they are hard to get in the US. That said, at least many non-Apple laptops have decent trackpads and really nice screens nowadays.
I have no faith in Qualcomm to make even basic gestures towards the Linux community.
All I want is an easy way to install Linux on one of the numerous Snapdragon laptops. I think the Snapdragon ThinkPad might work, but none of the others really do.
A $400 ARM laptop with good Linux support would be great, but it's never, ever going to happen.
The fact is, Linux support has accelerated heavily, both from Qualcomm and from Linaro on their behalf. Anyone who watches the Linux ARM mailing lists can attest to that.
The hardware has already been out for a year. Outside of a custom spin by the Ubuntu folks, even last year's notebooks aren't well supported out of the box on Linux. I have a Yoga Slim 7x and I tried the Ubuntu spin at some point - it required me to first extract the firmware from the Windows partition because Qualcomm had not upstreamed it into linux-firmware. Hard to take Qualcomm seriously when the situation is like this.
Qualcomm _does_ upstream all their firmware, but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load. This is an actual security feature, believe it or not. Besides, chances are it wasn't even Qualcomm's firmware, but rather Cirrus firmware for sound, or display firmware, etc.
I get the hate on Qualcomm, but you're really one LLM question away from understanding why they do this. I should know, I was also getting frustrated before I read up on this.
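If it helps anyone, the extraction itself is less scary than it sounds. Very roughly, and this is only a sketch (the device node, source directory and destination paths below are placeholders that vary per machine):

    # mount the Windows partition read-only
    sudo mount -o ro /dev/nvme0n1p3 /mnt/windows
    # the vendor-signed blobs usually live somewhere under the Windows driver store
    find /mnt/windows/Windows/System32/DriverStore -iname '*.mbn'
    # copy whatever the kernel is asking for (dmesg | grep -i firmware shows the exact names/paths)
    sudo cp <the files dmesg complains about> /lib/firmware/...
    # rebuild the initramfs so early-loaded firmware gets picked up
    sudo update-initramfs -u    # or: dracut --force, depending on distro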
I get where you're coming from, but I think the job of a company pushing a platform is to make it "boring", i.e. it should work out of the box on Debian/Fedora/Arch/Ubuntu. The platform vendor (Qualcomm) is the only one with enough sway to push the different laptop manufacturers to do the right thing. This is the reason why both Intel / Windows push compliance suites which have a long list of requirements before anyone can put the Windows / Intel logo on their device. If Qualcomm is going to let Acer / Lenovo decide whether things work out of the box on Linux, then it's never going to happen.
Can you please let me know if there is an ISO to get any mainstream Linux distro working on this Snapdragon laptop?
ASUS - Vivobook 14 14" FHD+ Laptop - Copilot+ PC - Snapdragon X
It's on sale for $350 at Best Buy, and if I can get Linux working on it, it would definitely be an awesome gift for myself.
Even if there's some progress being made, it's still nearly impossible to install a typical Linux distro on one of these. I've been watching this space since the snapdragon laptops were announced. Tuxedo giving up and canceling their Snapdragon Linux laptop doesn't instill much confidence
That covers the Elite, not the cheaper Snapdragon X laptops such as the ASUS Vivobook 14 (X1407QA).
I've followed that thread for almost a year. It's a maze of hardware issues and poor compatibility.
From your other response.
>but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load.
This makes the install process impossible without an existing Windows install. It's easier to say it doesn't work and move on.
It's going to be significantly easier to buy and run Linux on an x86 laptop.
Not to mention that no out-of-the-box Linux Snapdragon X Elite laptop exists. It's a shame, because it would probably be an amazing product.
This sounds a lot like the story of how AMD's approach on Linux had supposedly changed, and yet everyone I know who wants to use their GPU fully still used Nvidia. For a decade or more I've heard how AMD has turned over a new leaf and their drivers are so much better. Even geohot was going to undercut Nvidia by just selling tinygrad boxes on AMD.
Then it turned out this was the usual. Nothing had changed. It was just that people online have this desire to express that “the underdog” is actually better. Not clear why because it’s never true.
AMD is still hot garbage on Linux. Geohot primarily sells “green boxes”. And the MI300x didn’t replace H100s en masse.
Maybe it's just that you're mostly viewing this through the LLM lens?
I remember having to fight with fglrx, AMD's proprietary Linux driver, for hours on end. Just to get hardware acceleration for my desktop going! That driver was so unbearable I bought Nvidia just because I wanted their proprietary driver. Cut the fiddling time from many hours to maybe 1 or 2!
Nowadays, I run AMD because their open-source amdgpu driver means I just plonk the card into the system, and that's it. I've had to fiddle with the driver exactly zero times. The last time I used Nvidia is the distant past for me.
So - for me, their drivers are indeed "so much better".
But my usecase is sysadmin work and occasional gaming through Steam / Proton.
I ran LMStudio through ROCm, too, a few times. Worked fine, but I guess that's very much not representative for whatever people do with MI300 / H100.
I've been playing lots of games on an AMD GPU (RX 7600) for about a year and I can't remember a game that had graphical issues (e.g. driver bugs).
Probably something hasn't run at some point but I can't remember what, more likely to be a Proton "issue". Your main problem will be some configuration of anti-cheat for some games.
My experience has been basically fantastic and no stress. Just check that games aren't installing some native Linux build, which is inevitably extremely out of date and probably won't run. E.g.: Human Fall Flat (very old, won't run), Deus Ex: Mankind Divided (can't recall why, but I elected to install the Proton version; I think performance was poor or mouse control was funky).
I guess I don't play super-new games so YMMV there. Quick stuff I can recall, NMS, Dark Souls 1&2&3, Sekiro, Deep Rock Galactic, Halo MCC, Snow runner & Expeditions, Eurotruck, RDR1 (afaik 2 runs fine, just not got it yet), hard space ship breaker, vrising, Tombraider remaster (the first one and the new one), pacific drive, factorio, blue prince, ball x pit, dishonored uhhh - basically any kind of "small game" you could think of: exapunks, balatro, slay the spire, gwent rougemage, whatever. I know there were a bunch more I have forgotten that I played this year.
I actually can't think of a game that didn't work... Oh this is on Arch Linux, I imagine Debian etc would have issues with older Mesa, etc.
Works very well for me! YMMV maybe depending on the titles you play, but that would probably be more of a Proton issue than an AMD issue, I'd guess.
I'm not a huge gamer, so take my experience with a grain of salt. But I've racked up almost 300 hours of Witcher3 with the HQ patch on a 4k TV display using my self-compiled Gentoo kernel, and it worked totally fine. A few other games, too. So there's that!
Don’t know what the LLM lens is. I had an ATI card. Miserable. Fglrx awful. I’ve tried various AMDs over the last 15 years. All total garbage compared to Nvidia. Throughout this period I was consistently informed of new OSS drivers, blah blah. Linus says “fuck nvidia”. AMD still rubbish.
Finally, now I have 6x4090 on one machine. Just works. 1x5090 on other. Just works. And everyone I know prefers N to A. Drivers proprietary. Result great. GPU responds well.
Google has previously delivered good Linux support on Arm Chromebooks and is expected to launch unified Android+ChromeOS on Qualcomm X2 Arm devices in 2026.
> x86 chips can surpass the M series CPUs in multithreaded performance, but are still lagging in single-threaded performance
Nodding along with the rest but isn't this backwards? Are M series actually outperforming an Intel i9 P-core or Ryzen 9X in raw single-threaded performance?
Not in raw performance, no, but they're only beat out by i9s and the like, which are very power hungry. If you care even a little bit about performance per watt, the M series are far superior.
Have a look at Geekbench's results.[1] Ignore the top ones, since they're invalid and almost certainly cheated (click to check). The iPads and such lower down are all legit, but the same goes for some of the i9s in between.
And honestly, the fact that you have to go up to power hungry desktop processors to even find something to compete with the chip that goes in an (admittedly high-end) iPad, is somewhat embarrassing on its face, and not for Apple.
Yes, the M4 is still outperforming the desktop 9950X in single-threaded performance on several benchmarks like Geekbench and Cinebench 2024 [1]. Compared to the 9955HX, which is the same physical chip as the 9950X but lower clocked for mobile, the difference is slightly larger. But the 16 core 9950X is obviously much better than the base M4 (and even the 16 core M4 Max, which has only 12 P cores and 4 E cores) at multithreaded applications.
However, the M2 in the blog post is from 2022 and isn't quite as blazingly fast in single thread performance.
Dealing with Honor support is a pain. They don't understand anything at all, and it's impossible to get them off their script if you have a problem.
I have an Honor 200 Pro, and the software is buggy and constantly replaces user configurations with its defaults every 3 or 4 days.
I would avoid anything Honor in the future at any cost.
Honor, strangely enough, doesn't make any effort to really support Linux.
The machine quality is pretty damn good, but Huawei machines are still better. Apple level of quality. And Huawei releases their machines with Linux preinstalled
The company to watch is Wiko. It's their French spin-off to sidestep the chip ban. They might put out some very nice laptops, but that's a bit TBD.
The closest laptop to MacBook quality is surprisingly the Microsoft Surface Laptop.
As to x86, Zen 6 will be AMD's first major architecture rework since Apple demonstrated what is possible with wide decode. (Well, more accurately, it should be "since the world took notice", because it happened long before the M1.) It still likely won't match the M5 or even the M4 in single-threaded performance per watt, but hopefully it will be close.
> Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
Ohh, thanks for that link; I was thinking about updating to the latest on my Asusbook S15, but I think I'll stick with the current Ubuntu concept for now... saved me some trouble!
My personal beef with Thinkpads is the screen. Most of the thinkpads I’ve encountered in my life (usually pretty expensive corporate ones) had shitty FHD screens. I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
FWIW if you buy new from Lenovo, getting a more high-res display has been an option for years.
I'm on the other side where I've been buying Thinkpads partly because of the display. Thinkpads have for a long time been one of the few laptop options on the market where you could get a decent matte non-glare display. I value that, battery life and performance above moar pixels. Sure I want just one step above FHD so I can remote 1080p VMs and view vids in less than fullscreen at native resolution but 4K on a 14" is absolute overkill.
I think most legit motivations for wanting very high-res screens (e.g. photo and video editing, publishing, graphics design) also come with wanting or needing better quality and colors etc too, which makes very-highly-scaled mid-range monitors a pretty niche market.
> I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
Did you make a serious effort while having an extended break from retina screens? I'd think you would get used to it pretty quickly if you allow yourself to readjust. Many people do multi-DPI setups without issues - a 720p and a 4k side-by-side for example. It just takes acclimatizing.
I have a 14” FHD panel (158 dpi) on an old (7 year) laptop and there’s more issues with low resolution icons and paddings than with font rendering. I wouldn’t mind more, but it’s not blurry.
I just learned on Reddit the other day that people replace those screens with third party panels, bought from AliExpress for peanuts. They use panelook.com to find a compatible one.
Old Thinkpads are great! I used to have a Lenovo Thinkpad X1 Carbon Gen 6 with Intel Core i7 8640U, 16 GB of RAM, and 1 TB SSD. I installed Arch Linux on it with Sway.
> On the build quality side, basically all the PCs are still lagging behind Apple,
This is an oft-repeated meme, but not really true. Thinkpads, high-end lightweight gaming laptops like the Asus G14... There are many x86 laptops with excellent build quality.
I've moved completely to EliteBooks and am very happy with my decision. The build quality is superb, they're upgradeable, everything is replaceable and there's an excellent market and after market for parts, and HP has codepaths in their firmware for Linux support, meaning even Modern Standby works well.
Price points for refurb and used hardware are great, too.
But they’re heavier, slower, have more intrusive active cooling, have much worse battery life (mostly due to the processor), and have some lower-quality user interface components. Don’t get me wrong, they’re decent hardware! It’s just that the MacBook Air benchmark is very high.
The key qualities of something like a macbook air are:
It has no fans.
Its temperature never changes unless you really push it. I've never used any other laptop where I couldn't feel at least some warmth when it was turned on.
My m1 air still has enough battery to run for a full day of usage, here several years after I bought it. Basically never loses power while the lid is closed either, but that is less of an issue.
Looking at a Thinkpad 16" P1 Gen 8 with 2X 1TB SSD, 64GB RAM, QHD+ screen, centered keyboard like MBP (i.e. no numpad), integrated Intel GPU, lightweight (4 lbs) for a little under $2.5K USD.
Closest I've found to an MBP 16" replacement.
Have been running Dell Precision laptops for many years on Linux, not sure about Lenovo build quality and battery life, but hoping it will be decent enough.
Would run Asahi if it supported the M4, but it looks like that's a long way away...
I'm using T14s Gen 4 Intel and sleep works for me. I'm using it in clamshell mode connected to external display 99% of the time, so I don't really use sleep all the time, but the few times I tested it, it worked. Actually every hardware peripheral, including fingerprint sensor, worked out of the box. I was pleasantly surprised by that kind of support.
I've got a relatively new p16s with a hybrid Nvidia/Intel GPU, and a p14s gen 5 with an AMD GPU, and I was able to get both of them to suspend by closing the lid. Not sure if the issue you speak of is unique to the P1 or not, but all my ThinkPads have been decent with Linux.
I’ve had issues with the T14s for a couple of gens where the machine wakes up while the lid is closed and runs the battery down. I’ve tried the usual troubleshooting.
This has been a non issue on Dell machines for almost 20 years.
Somewhat related yet not: I had a Dell laptop nearly kill itself by waking up while in my backpack and almost melting. I think I blame Windows Update for this, though.
This resulted in the laptop not being able to power on most of the time after that.
Oh some kernel params and other settings can help with that. These are mine, and it's been working great:
Kernel params
## Seems to be needed for suspend to S0 (s2idle) without hanging (only needed on p16s)
acpi_osi="Windows 2022"
# Prevent spurious wakeups from a firmware bug where the EC or SMU generates spurious "heartbeat" interrupts during sleep
acpi.ec_no_wakeup=1
# Prevents dock from waking up laptop right after suspend
usbcore.autosuspend=-1
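For reference, these go into GRUB on my machines; a sketch, assuming /etc/default/grub (append to whatever params you already have, and adjust for your distro/bootloader):

    # /etc/default/grub
    GRUB_CMDLINE_LINUX_DEFAULT="... acpi_osi=\"Windows 2022\" acpi.ec_no_wakeup=1 usbcore.autosuspend=-1"
    # then regenerate the config
    sudo update-grub    # or: sudo grub-mkconfig -o /boot/grub/grub.cfg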
Other settings (executed with a systemd service) (also only needed on p16s, not on my p14s)
I am giving my MacBook Air M2 15” to my wife and last night bought a Lenovo E16 with a 120Hz screen to run Kubuntu. She needed a new laptop, and I've had enough of macOS and just need some stuff to work that will be easier on Intel and Linux. Also I do bookwork online, so the bigger screen and dedicated numpad will be nice.
It reviews well and seems like good value for money with current holiday sales but I don’t expect the same hardware quality or portability just a little more freedom. I hope I’m not too disappointed.
https://www.notebookcheck.net/Lenovo-ThinkPad-E16-G3-Review-...
If you're running desktop Linux, you will have a better experience with a rolling release than being stuck with whatever state the software was frozen in for Debian/Ubuntu, especially when it comes to multimedia, graphics, screen sharing, etc.
Modern desktop Linux relies on software that's being fixed and improving at a high velocity, and ironically, can be more stable than relying on a distro's fixed release cycles.
KDE Plasma, Wayland support, Pipewire, etc all have had recent fixes and improvements that you will not get to enjoy for another X months/years until Canonical pulls in those changes and freezes them for release.
Similarly, newer kernels are a must when using relatively recent hardware. Fixes and support for new hardware lands in new kernels, LTS releases might not have the best support for your newer hardware.
> can be more stable than relying on a distro's fixed release cycles
Stability for a distro means “doesn’t change” not “doesn’t crash”.
Debian/ubuntu are stable because they freeze versions so you can even create scripts to work around bugs and stuff and be sure that it will keep working throughout that entire release.
Arch Linux is not stable because you get updates every day or whatever. Maybe you had some script or patch to work around a bug and tomorrow it won’t work anymore.
This does not say _anything_ about crashing or bugs, except that if you find a bug/crash on a stable system then it is likely you can rely on this behaviour.
Agree. If you use a rolling release you definitely need a strategy for stability. I turn off automatic updates and schedule planned full updates that I can easily roll back from. I've had two breakages over the years that required snapper rollback. (Rolling back from a major distro upgrade isn't that easy)
It's a tradeoff that I'm happy with. I get to have a very up to date system.
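For anyone curious, the rollback itself is just a couple of commands on a snapper + Btrfs setup (the snapshot number below is illustrative; pick the pre-update one from the list):

    sudo snapper list           # find the snapshot taken before the bad update
    sudo snapper rollback 42    # 42 is a placeholder; makes a writable copy of it the new default
    sudo reboot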
That’s an interesting comment. I didn’t think about that. I’ve only ever used Ubuntu flavours, so I’ll search through what the popular rolling releases are out of interest.
Is this actually such a big point? I feel like (subjectively) on Ubuntu everything gets updated just as fast, and even if not, there's a new full release every 6 months. Or is this actually rather slow in comparison to Fedora?
I've also only used Debian-based stuff my whole life, and even moving from apt to dnf or whatever causes too much friction for me, haha, though it's not that bad obviously, if I could really see the positives.
I outfitted our 10 person team with the E16 g2 and it’s been great.
Two minor issues- it’s HEAVY compared to T models.
Because of the weight, try not to walk around with the lid up while holding it by one of the front corners. I've noticed one of them is kind of warped from walking around the office holding it that way.
That’s great news thanks. I got the gen 3 so maybe some improvements. Weight is ok as I really just move it around the house. I buy used Panasonics for the workshop.
Been a kubuntu user since .. 2006? 2007? Don't remember when kubuntu became a thing, but as soon as I tried Ubuntu, I went kubuntu. I believe it was 5.10 or 6.04 or something. :-)
Am growing tired of Ubuntu though. Just not sure where I should turn. I want a .deb based system. Ubuntu is pushing snaps too heavily for my liking.
I was a very long time Debian user who got burned by Ubuntu and derivatives far too many times, personally and professionally. I moved to Fedora a few years back and it was a great decision. No regrets.
I liked Ubuntu and its variants back when it first came out and I was newer to Linux, but it didn't take long for me to realise there always seemed to be a better option for me as a daily driver. To me it's like a new-Linux-user OS, where a lot of stuff is chosen for you to use basically as-is. Even the name Kubuntu, where the K is for KDE, whereas on other distros you would just choose your DE when you install.
I agree. It feels like a combination of peak Windows UI with the ease of Ubuntu baked in. And the little mobile app they have that gives you a shared clipboard with iOS is cool.
Performance is still very high so if they don't need the current top tier AMD horsepower, Intel is the way to go. It's also quieter, cooler and doesn't throttle. Not to mention the ability to use SRIOV GPU for running Windows software in a VM.
Also, Lenovo tends to limit HiDPI displays to Intel CPUs, for some ekhm unknown reason.
> I wish there were a Linux machine with the hardware quality of a MacBook
It really depends what you mean by "quality". To me first and foremost quality I look for in a laptop is for it to not break. As I'm a heavy desktop user, my laptop is typically with me on the couch or on vacation. Enter my MacBook Air M1: after 13 months, and sadly no extended warranty, the screen broke for no reason overnight. I literally closed it before going to bed and when I opened the lid the next day: screen broken. Some refer to that phenomenon as the "bendgate".
And every time I see a Mac laptop I can't help but think "slick and good looking but brittle". There's a feeling of brittleness with Mac laptops that you don't have with, say, a Thinkpad.
My absolute best laptop is a MIL-SPEC (I know, I know, there are many different types of military specs) LG Gram. Lighter than a MacBook too. And every single time I demo it to people, I take the screen and bend it left and right. This thing is rock solid.
I happen to have this laptop (not my vid) and look at 34 seconds in the vid:
The guy literally throws my laptop (well, the same) down concrete stairs and the thing still just works fine.
The friend who sold it to me (I bought it used) one day stepped on it when he woke up. No problemo.
To me that is quality: something you can buy used and that is rock solid.
Where are the vids of someone throwing a MacBook Air down the stairs and the thing keeps working?
I'm trading a retina display any day for a display that doesn't break when it accidentally falls on the ground.
Now I love the look and the incredible speed of the MacBook Air laptops (I still have my M1, but since its screen broke, I turned it into a desktop), but I really wish they were not desk queens: we've got desktops for that.
I don't want a laptop that requires exceptional care and mad packaging skills when putting it inside a backpack (and which then requires the backpack to be manipulated with extreme care).
So: bring me the raw power and why not the nice look of a MacBook Air, but make it sturdy (really the most important for me) and have it support Linux. That I'd buy.
Notice how much the screen wobbles after opening the laptop, around the one-minute mark. That does not happen even with the cheapest MacBook Air; that’s the kind of design quality people refer to.
As for light and sturdy, the Netbook era had it all. A shame the world moved on from that.
Phones simply won't survive a week without an industrial case. Screen projectors last as short as a single day.
The only computers that survived her JerryRigEverything levels of abuse are MacBooks+, which routinely fall off tables, down stairs, or simply out of hands.
One even fell while open at 90 degrees and rotated right onto the far edge at what would be the maximum torque position; there was massive deformation of the lid aluminum, but the lid was still flat, the glass had no cracks, and the whole thing remained perfectly functional.
(note: these are the older designs from the first unibody to the last Intel laptop, not the newer Mx ones)
+ Well, except one, which had an entire pint toppled over and sloshed right onto the screen, with the liquid sliding straight into the exhaust vents. (There was an audible poof as the screen went black.)
I've owned two LG gram laptops. Neither were milspec, but both were really nice. Sure, the screen quality isn't going to win any awards, nor will the speakers, but the light weight, fantastic battery life and snappy performance always get a recommendation from me.
I adore my Linux setup and have switched back to it after using M1 Pro for 3 years.
But through all the Dells, Thinkpads and Asus laptops I've had (~10), none were remotely close to the full package that the MBP M1 Pro was.
- Performance - outstanding
- Fan noise - non-existent 99% of the time, cannot compare to any other laptop I had
- Battery - not as amazing as people claim for my usage, but still at least 30% better
- Screen, touchpad, speakers, chassis - all highest tier; some PC laptops do screen (Asus OLED), keyboard and chassis (Thinkpad) better, but nothing groundbreaking...
It's the only laptop I've ever had that gave me the feeling that nothing could come my way that I wouldn't be able to do on it, without any drama whatsoever.
It's just too bad that I can't run multiple external displays on Asahi...
(For posterity, currently using Asus Zenbook S16, Ryzen HX370, 32GB RAM, OLED screen, was $1700 - looks and feels amazing, screen is great, performance is solid - but I'm driving it hard, so fan noise is constant, battery lasts shorter, and it's just a bit more "drama" than with MBP)
Excellent power efficiency in apple silicon - good battery life and good performance at the same time. The aluminum body is also very rigid and premium feeling, unlike so many creaky bendy pc laptops. Good screen, good speakers.
Aluminum and magnesium non-Apple laptops are just as stiff. There's just a wider spectrum of options, including $200 plastic ARM Chromebooks available.
Yes, this is the true dividing factor for me. The battery life of the new ARM laptops is an astounding upgrade from any device I have ever used.
I've been a reluctant MacBook user for 15 years now thanks to it being the de-facto hardware of tech, but for the first time ever since adopting first the M1 Pro and then an M2 Pro I find myself thinking: I could not possibly justify buying literally any other laptop so long as this standard exists.
Being able to run serious developer workflows silently (full kubernetes clusters, compilers, VSCode, multitudes of corpo office suite products etc), for multiple days at a time on a single charge is baffling. And if I leave it closed for a week at 80% battery, not only does that percentage remain nearly the same when resumed-- it wakes instantly! No hibernation wake time shenanigans. The only class of device which even comes close to being comparable are high end e-ink readers, and an e-ink reader STILL loses on wake time by comparison.
I'm at the point now where I'm desperately in need of an upgrade for my 8 year old personal laptop, but I'm holding off indefinitely until I discover something with a similar level of battery performance that can run Linux. As I understand it, the firmware that supports that insane battery life and specifically the suspend functionality that allows it to draw nearly zero power when closed isn't supported by any Linux distro or I would have already purchased another MacBook for personal use.
I am running Sway on Gentoo on a 4-year-old X1 Carbon.
> you will get similar usable battery life of around 6-8 hours
My MacBook M3 gives me way more than 6-8 hours; it's simply insane. It literally lasts for multiple days.
> Generally though, battery life isn't an issue anymore considering fast charging is everywhere.
Not an issue indeed, I got used to always charging my X1 carbon. But then I got an M3 for work, and... well it feels like I don't have to charge it ever :-).
As I said: very much a Linux person, but the M3 battery life is absolutely insane.
I’ve never heard someone describe the aluminum body as bad.. what do you not like about it?
The number one benefit is the Apple Silicon processors, which are incredibly efficient.
Then it’s the trackpad, keyboard and overall build quality for me. Windows laptops often just feel cheap by comparison.
Or they’ll have perplexing design problems, like whatever is going on with Dell laptops these days with the capacitive function row and borderless trackpad.
The keyboard and body are not bad at all - rather, they're best in class, and so is the rest of the hardware. It is a premium hardware experience, and has been since Jony Ive left, which is what makes the software so disappointing.
I believe there are a few all-metal laptops competing in the marketplace, but I was unaware they were actually better than the Apple laptops ... which all-aluminum laptops are better, and how are they better?
I just turn off trackpads, I'm not interested in that kind of input device, and any space dedicated to one is wasted to me. I use nibs exclusively (which essentially restricts me to Thinkpads).
My arms rest on the body, the last thing I want is for it to be a material that leeches heat out of my body or that is likely to react with my hands' sweat and oils.
Strawman. Because Apple designed it well. Metal’s not an issue. My legacy 2013 MacBook Air still looks and feels and opens like new.
I was looking at Thinkpad Auras today. There are unaligned jutting design edges all over the thing. From a design perspective, I’ll take the smooth oblong squashed egg.
Every PC laptop I’ve touched feels terrible to hold and carry. And they run Windows, and Linux only okay. Apple MacBooks are a long mile better than everything else, and so I don’t care about upgraded memory: buy enough RAM at purchase time and you don’t have to think about it again.
Memory upgrades aren’t priced super well, granted, but I could never buy HP Dell Lenovo ever again. They’re terrible. I’ve had all of them. Ironically the best device I’ve had from the other side was a Surface Laptop. But I don’t do Microsoft anymore. And I don’t want to carry squeaky squishy bendy plastic.
Most of all, I’m never getting on a customer support call with the outsourced vendors that do the support for those companies ever ever ever again. I’ll take a visit to an Apple store every day of the week.
If the MacBook has a bad keyboard (ignoring the Butterfly switches, which aren't on any of the M series machines, which are the ones people actually recommend and praise), then the vast majority of Windows machines have truly atrocious keyboards. I prefer the keyboard on my 2012 MacBook to the newer ones, but it's still better than the Windows machines I can test in local stores.
I prefer the aluminium to the plastic found on most Windows machines. The Framework is made from some aluminium alloy from what I know, and I see that as a good thing.
The soldered RAM sucks, but it's a trade-off I'm willing to make for a touchpad that actually works, a pretty good screen, and battery life that doesn't suck.
> "I never understood why people claim the Macbook is so good."
Apple's good enough for the average consumer, just like a 16-bit home computer back in the day. Everyone who looks for something bespoke/specialized (e. g. certified dual- or multi-OS support, ECC-RAM, right-to-repair, top-class flicker-free displays, size, etc.) looks elsewhere, of course.
>I am very impressed with how smooth and problem-free Asahi Linux is. It is incredibly responsive and feels even smoother than my Arch Linux desktop with a 16 core AMD Ryzen 7945HX and 64GB of RAM.
Hmmm, I still have an issue with the battery in sleep mode on the M1. It drains a lot of battery in sleep mode compared to macOS's sleep.
I've been an Asahi user since the early stages of the project when it used Arch. Today, I run Fedora Asahi Remix on a Mac Studio M1 Ultra with the Sway desktop, and it truly has been the perfect Linux workstation in every way.
Hey, have you ever tried compiling the Linux kernel on it? In my experience it's often difficult to find compile benchmarks for Apple Silicon that aren't Xcode.
Seems like a crazy hobby to me though! Photography is inconvenient enough without having to make your own mounts and use an sdk to do it! History is filled with inconvenient hobbies though.
I would agree with the sentiment about the lack of good bright screens for lenovo's hacker laptops like the X1 carbon.
Why? Lots of people more or less use their computer as a glorified web browser, with some Zoom calls and document editing thrown in for good measure. 256GB seems like overkill. My girlfriend is somehow still rocking a 2011 MacBook Air. She mostly just uses it for internet banking and managing her finances. Why would she want more than 256GB?
A 1TB M.2 SSD cost 70 USD in summer 2025, and probably much less when bought in bulk as bare chips. It doesn't make sense to install anything less than 1TB in an expensive premium laptop. Or it should at least be upgradeable.
Apple's pricing is one of the reasons I am not going to buy their laptops. Expensive, and with no upgradeable or replaceable parts. And closed-source OS with telemetry.
> Lots of people more or less use their computer as a glorified web browser
For this purpose they can buy a $350 laptop with a larger screen.
Because the price tag is quite high to get as much storage as you would 15 years ago for about the same money.
I agree that many people use them as glorified internet machines but even then when they occasionally decide to back up some photos or edit a few videos the 256GB non-upgradable storage quickly becomes a limitation.
Price matters. 256GB is fine on a $500 web browsing laptop, but on a $1000+ one it's just a bad deal in 2025, even ignoring the fact that you cannot upgrade it later (it's soldered in place).
Possibly, but I don't see why those people would buy a new MacBook rather than a used $100 laptop (which would be better both for their finances and for the planet...)
Have you ever used windows on a $100 second hand laptop?
Imagine for a second that you don't know much about computers. You buy something crap like that and turn it on. Windows is of course already installed. Along with 18 antivirus programs and who knows what other junk. The computer will run dog slow. Even if you get rid of all the preinstalled programs, it'll run horribly slowly.
My mum has a computer from her work. It's pretty recent, worth way more than $100. It takes about 5-10 seconds for Zoom or Google Chrome to start. And about 15 seconds for Outlook to open. It's an utterly horrible experience.
If you can afford it, you'll have a way better experience on a macbook air from the last few years. In comparison, everything starts instantly. The experience is fantastic. Premium, even.
Personally I think it's criminal that cheap laptops run modern software so poorly. It's just laziness. There's no reason for the experience to be so horrible. But the world being what it is, there is plenty of reasons to spring for a $1000 macbook air over a $100 second hand windows crapbook if you can afford it. Even if you don't do much with the computer.
> there is plenty of reasons to spring for a $1000 macbook air over a $100 second hand windows crapbook if you can afford it
Plus you can pick up a used M1 MacBook Air for as little as $300 these days. Despite being 5 years old, it'll still smoke anything on the PC side much under a grand, in terms of responsiveness.
I don’t know, I kind of like 10 hrs on battery with normal usage and screen fully lit on a 15” screen while not being bulky. Virtually no contenders in that space.
I think they mean that in 2025, 256GB is unreasonably small. Which is true; Apple wants to upcharge hundreds of dollars just to get to the otherwise standard 1TB drive.
That could be; it totally wasn't clear to me whether they meant the hardware is unreasonably small, or that it's unreasonable for Asahi to call that the minimum you could run on.
Honestly, I suspect there are a whole lot of people that could be perfectly happy with 256GB storage on an Air. I mean, sure, I got 2TB for my MBP 5 years ago, but my father-in-law is likely never going to need even close to 256GB on his, which is basically being a Chromebook for him.
I just bought a 2TB drive over the holidays and it was $250, so it's not like they're an insignificant percentage of a <$1K laptop.
From a supply perspective, 256GB seems ridiculous because you can get way more capacity for not very much money, and because 256GB is now nowhere close to enough flash chips operating in parallel to reach what is now considered high performance.
But from a demand perspective, there are a lot of PC users for whom 256GB is plenty of capacity and performance. Most computers sold aren't gaming PCs or professional workstations; mainstream consumer storage requirements (aside from gaming) have been nearly stagnant for years due to the popularity of cloud computing and streaming video.
Each controller and subcomponent on the motherboard needs a driver that correctly puts it into low power and sleep states to get battery savings.
Most of those components are proprietary and don't use the standard drivers available in Linux kernel.
So someone needs to go and reverse engineer them, upstream the drivers, and pray that Apple doesn't change them in the next revision (which they did), or the whole process needs to start again.
In other words: get an actually Linux supported laptop for Linux.
One of my favorite machines was the MacBook Air 11 (2012). This was a pure Intel machine, except for a mediocre Broadcom wireless card. With a few udev rules, I squeezed out the same battery performance from Linux I got from OS X, down to a few minutes of advantage in favor of Linux. And all this despite Safari being a marvel of energy efficiency.
The problem with Linux performance on laptops boils down to i) no energy tweaks by default and ii) poor device drivers due to the lack of manufacturer cooperation. If you pick a machine with well supported hardware and you are diligent with some udev rules, which are quite trivial to write thanks to powertop suggestions, performance can be very good.
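To give a flavor, the rules end up looking something like this; the filename and exact values are illustrative, apply what powertop actually suggests for your hardware:

    # /etc/udev/rules.d/90-powersave.rules
    # runtime power management for PCI devices
    ACTION=="add", SUBSYSTEM=="pci", ATTR{power/control}="auto"
    # USB autosuspend
    ACTION=="add", SUBSYSTEM=="usb", ATTR{power/control}="auto"
    # SATA link power management, where supported
    ACTION=="add", SUBSYSTEM=="scsi_host", KERNEL=="host*", ATTR{link_power_management_policy}="med_power_with_dipm"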
I am getting a bit more than 10 hours from a cheap ThinkPad E14 Gen7, with a 64 Wh battery, and light coding use. That's less than a MacBook Air, where I would be getting around 13-14 hours, but it's not bad at all. The difference comes mainly from the cheap screen, which consumes more power, and from ARM's superior efficiency when idling.
But I prefer not to trade the convenience and openness of x86_64 plus NixOS for a bit more battery range. IMHO, the gap is not sufficiently wide to make a big difference in most usage scenarios.
The need to tweak that deeply just to get “baseline” performance really stings, though, particularly if you’re not already accustomed to having to do that kind of thing.
It’d be a gargantuan project, but there should probably be some kind of centralized, cross-distro repository for power configuration profiles that allows users to rate them with their hardware. Once a profile has been sufficiently user-verified and is well rated, distro installers could then automatically fetch and install the profile as a post-install step, making for a much more seamless and less fiddly experience for users.
> 40% battery for 4hrs of real work is better than pretty much any linux supported laptop I've ever used
Not sure what "real work" is for you, but I regularly get more than 12 hours of battery life on an old Chromebook running Linux and the usual IDEs/dev tooling (in a Crostini VM). All the drivers just work, and sleep has no detectable battery drain. It's not a workstation by any means, but dual-core Intels are great for Python/Go/TypeScript.
Out of curiosity, does Google contribute the drivers for Chromebook hardware to Linux upstream, or do they keep them for themselves? Could it be that they just choose hardware that works very well out of the box with Linux?
I have no idea if there's upstreaming, but the Chromium OS repo is open source, so you could check.
I don't know if that would help the wider Linux laptop community, because Chromebook OEMs can only select from a small list of CPU & chipset hardware combinations blessed by Google
What's the bar here? My Thinkpad X270 gets about 16 hours under Ubuntu with swaywm.
If we really want to get pedantic, its internal battery means the external pack is hot-swappable, so I can actually get several days on a "single charge." Good machine for camping trips.
Yet. Plenty of people have with Intel ones; I'm one of them. My first experience with Linux was on a 2016 MacBook Pro. And inevitably people will do the same with the Apple silicon Macs, likely using Asahi, it seems.
It's not inevitable. That's not what that word means.
Intel Macs supported Linux because they used Intel's Linux drivers and supported bog-standard UEFI. There are no preexisting drivers or DeviceTree files published by Apple for Linux. There is no UEFI implementation, just a proprietary bootloader that can be updated post hoc to deny booting into third-party OSes.
> Why are some of y'all so hostile to this idea?
I would love for Linux to support as many ARM devices as possible. Unfortunately, it requires continuous effort from the OEM to be viable. I've bought Qualcomm, Rockchip and Broadcom boards before, none of them have been supported for half as long as my x86 machines are. Nevermind how fast ARM architectures become obsolete.
It feels like Apple is really the only hostile party here, and they coincidentally decide whether or not you get to use third-party OSes.
It is inevitable. I guarantee you there will be people who run Linux on their silicon Macs. I don’t know how you could possibly hold a stance that no one ever will.
Apple is very hostile to it. It won’t stop everyone though. It’ll continue to be niche but it’s happening.
It's not inevitable. It's fragile. Go boot up your old iPad; that should be well-studied, right? We ought to know how to boot into Linux on an ARM machine that old, it's only fair.
Except, you can't. The bootloader is the same iBoot process that your Apple Silicon machine uses, with mitigations to prevent unsigned OSes or persistent coldboot. All the Cydia exploits in the world won't put Linux back on the menu for iPhone or iPad users. And the same thing could happen to your Mac with an OTA update.
It is entirely possible for Apple to lock down the devices further. There's no guarantee they won't.
> There is literally an apple-developed way to boot securely into alternative OSs
This is not a good thing! You do not want a proprietary bootloader as your only way to launch Linux, it's not a safe or permanent solution. Apple Silicon could have implemented UEFI like the previous Macs did, but Apple chose to lock users into a bootloader they controlled. This is markedly different from most ARM device bootloaders which aren't changed by an OTA update in another OS partition.
> if only apple is hostile
I did not say that at any point in my comment. Forgeties' original claim was that Apple Silicon would eventually have support on Linux comparable to Intel MacBooks. I am telling them point-blank that it is impossible, because Apple and Intel have fundamentally different attitudes towards Linux.
> where is my Xbox/ps/switch/any random Android tablet/million other device's running Linux?
Your Switch and Android tablet are already running the Linux kernel. The past 2 generations of Xbox and PS chipsets have upstream support from AMD in the Linux kernel, so you really only need a working bootloader to get everything working.
Ironically, this does mean that the Nintendo Switch has more comprehensive Linux support than Apple Silicon does.
Sure, the kernel. But you surely know that android abstracts away the drivers, so without the proprietary drivers you are back to square zero - it's not "GNU/Linux".
Apple cannot lock down the Mac. You can’t have a development machine that is incapable of running arbitrary code. Back when they still did WWDC live they said that software development was the biggest professional bloc of Mac users. I’m certain that these days development is the biggest driver of the expensive Macs. No one has ever made a decent argument as to why Apple would lock down the Mac that would also explain why they haven’t done it yet.
Passivity isn’t hostility. There isn’t any evidence that Apple is considering locking down the Mac. They could have easily done that with the transition to their own silicon but they didn’t despite the endless conspiracy theories.
Apple can lock down the Mac. You might not think it is likely, but without UEFI there is no path of recourse if Apple decides to update iBoot. How do you launch Asahi if Apple quits reading the EFI from the secure partition?
> They could have easily done that with the transition to their own silicon
They already did, that's what my last comment just outlined. Macs do not ship with UEFI anymore, you are wholly at the mercy of a proprietary bootloader that can be changed at any time.
Because the Apple laptops use exactly the same architecture and bootloader as iPads - if you think they're separate you don't know enough to be part of this debate.
That's an admirable goal, but, depending on the hardware, it can run into that pesky thing called reality.
It's getting very tiresome to hear complaints about things that don't work on Linux, only to find that they're trying to run it on hardware that's poorly supported, and that's something they could have figured out by doing a little research beforehand.
Sometimes old hardware just isn't going to be well-supported by any OS. (Though, of course, with Linux, older hardware is more likely to be supported than bleeding-edge kit.)
This is very true. I've been asked by lots of people "how do I start with Linux?" and, despite being a 99.9% Linux user for everything every day, my advice was always:
1. Use VirtualBox. Seriously, it won't look cool, but it will 100% work after maybe 5 mins mucking around with installing guest additions. Also snapshots. Also no messing with WiFi drivers or graphics card drivers or such.
2. Get a used beaten down old Thinkpad that people on Reddit confirm to be working with Linux without any drivers. Then play there. If it breaks, reinstall.
3. If the above didn't make you yet disinterested, THEN dual boot.
Also, if you don't care about GUI, then use the best blessing Microsoft ever created - WSL, and look no further.
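These days that's basically one command from an elevated PowerShell (it defaults to Ubuntu, if memory serves):

    wsl --install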
I've never gotten along too well with virtualization, but would second the ThinkPad idea, or something similar. Old/cheap machine for tinkering is a good way to ease in, and I think bare metal feels more friendly.
I'd probably recommend against dual booting, but I understand it's controversial. I like to equate it to having two computers, but having to fully power one off to do anything* on the other one. Torrents stop, music collection may be inaccessible depending on how you stored it, familiar programs may not be around anymore. I dual booted for a few years in the past and I found it miserable. People who expected me to reboot to play a game with them didn't seem to understand how big of an ask that really was. Eventually things boiled over and I took the Windows HDD out of that PC entirely. Much more peaceful. (Proton solves that particular issue these days also)
That being said, I've had at least two friends who had a dual boot due to my influence (pushing GNU/Linux) who ended up with some sort of broken Windows install later on and were happy to already have Ubuntu as an emergency backup to keep the machine usable.
*Too old might be a problem these days with major distros not having 32bit ISOs anymore
I went 100% Bazzite back in April/May, no Windows, and I couldn't be happier. The PC I built is basically 90% gaming/movies/hanging with friends, 10% browser tasks. Very easy to live this life if you don't have particular professional needs, IMO. When I was doing more freelance editing this really would not have been an option, as Resolve Studio does not work well on Linux.
I've tried this once for IntelliJ to work around slow WSL access for Git repos. Was greeted by missing fonts and broken scaling on the intro screen. Oops. But probably I was just unlucky, it might work well for most.
It's a common use-case for x86 machines that implement UEFI. Taking the iPhone and iPad into account, it is a nonexistent use-case for mobile ARM chipset owners.
I know you may have a particular axe to grind here, but Android devices are not a whole lot more likely to let you boot a vanilla Linux distro. Apart from a handful of explicitly Linux-compatible smartphones, the bootloaders tend to be pretty locked down, and the drivers are all proprietary too.
Apple does tons of optimizations for every component to improve battery life.
Asahi Linux, which is reverse engineered, doesn't have the resources to figure out each of those tricks, especially for undocumented proprietary hardware, so it's a "death by a thousand cuts" as each of the various components is always drawing a couple of milliwatts more than on macOS.
Eh it's pretty awful. I get 8 hours, yes, but in Linux, those 8 hours are ticking whether my laptop is sleeping in my bag or on my desk with the lid closed or I'm actively using it. 8 hours of active use is pretty good, but 8 hours in sleep is absolutely dreadful.
Exactly. This myth keeps being perpetuated, for some reason.
I'm typing this from a ThinkPad X1 Carbon Gen 13 running Void Linux, and UPower is reporting 99% battery with ~15h left. I do have TLP installed and running, which is supposed to help. Realistically, I won't get around 15h with my usage patterns, but I do get around 10-12 hours. It's a new laptop with a fresh battery, so that plays a big role as well.
This might not be as good as the battery life on a Macbook, but it's pretty acceptable to me. The upcoming Intel chips also promise to be more power efficient, which should help even more.
For optimal battery life you need to tweak the whole OS stack for the hardware. You need to make sure all the peripherals are set up right to go into the right idle states without causing user-visible latency on wake-up. (Note that often just one peripheral being out of tune here can mess up the whole system's power performance. Also the correct settings here depend on your software stack). You need to make sure that cpufreq and cpuidle governors work nicely with the particular foibles of your platform's CPUs. Ditto for the task scheduler. Then, ditto for a bunch of random userspace code (audio + rendering pipeline for example). The list goes on and on. This work gets done in Android and ChromeOS.
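For the cpufreq/cpuidle part specifically, the knobs themselves are easy to poke at; the hard part is knowing which settings suit your platform (the governor below is just an example):

    # current and available cpufreq governors
    cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
    cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_governors
    # switch governor on all cores (example: schedutil)
    sudo cpupower frequency-set -g schedutil
    # list the cpuidle states the CPU can enter
    grep . /sys/devices/system/cpu/cpu0/cpuidle/state*/name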
This doesn't match my experience. My previous three laptops (two AMD Lenovo Thinkpads, one Intel Sony VAIO) had essentially the same battery life running Linux as running Windows.
I also have an X13 Gen2 AMD. My idle power consumption is 2.5W to 4W depending on brightness. This ends up at 12h-15h (machine/battery is 2 years old, I think).
Hey! I actually wrote a thing to make the Swaybar a little more "complete" (e.g. battery status, currently selected program, clock, inspirational quote from ChatGPT, etc): https://git.sr.ht/~tombert/swaybar3
Not going to claim it will change the world or anything, but this runs perpetually with Sway and according to System Monitor it hovers at a little less than a megabyte of RAM. You can set how often you want things to update, and add as many sections as you'd like, and it's easy to create extra modules if you are so inclined (though not as easy as the Clojure version since I haven't found an implementation of multimethods for Rust that I like as much).
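If anyone wants to try it or something like it, wiring a custom status program into sway is just a bar block in the sway config; the binary path here is illustrative, point it at wherever yours ends up:

    # ~/.config/sway/config
    bar {
        position top
        # any program that prints swaybar-protocol JSON (or plain text) to stdout works here
        status_command ~/.cargo/bin/swaybar3
    }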
A new Wayland protocol is in the works that should support screen cutout information out of the box: https://phosh.mobi/posts/xdg-cutouts/ Hopefully this will be extended to include color information whenever applicable, so that "hiding" the screen cutout (by coloring the surrounding area deep black) can also be a standard feature and maybe even be active by default.
You can't be serious. Wayland is the opposite of modular, and the concept of an extensible protocol only creates fragmentation.
Every compositor needs to implement the giant core spec, or, realistically, rely on a shared library to implement it for them. Then every compositor can propose and implement arbitrary protocols of their own, which should also be supported by all client applications.
It's insanity. This thing is nearly two decades old, and I still have basic clipboard issues[1]. This esoteric cutouts feature has no chance of seeing stable real-world use for at least a decade.
Shh... you're not supposed to mention these things, lest you be downvoted to death.
I also have tremendous issues with Plasma. Things such as graphics glitching in the alt+tab task switcher or Firefox choking the whole system when opening a single 4k PNG image. This is pre-alpha software... So back to X11 it is. Try again in another decade or two.
YMMV and all, but my experience is that Wayland smoothness varies considerably depending on hardware. On modernish Intel and AMD iGPUs for example I’ve not had much trouble with Wayland whereas my tower with an Nvidia 3000 series card was considerably more troublesome with it.
Generally true, though this particular case is due to a single company deciding to not play ball and generally act in a manner that's hostile to the FOSS world for self-serving reasons (Nvidia).
I don't think it's even that. These bugs seem like bog-standard bugs related to correctly sharing graphics resources between processes and accessing them with correct mutual exclusion. Blaming NV is likely just a convenient excuse.
> my tower with an Nvidia 3000 series card was considerably more troublesome with it.
I think you're describing a driver error from before Nvidia really supported Wayland. My 3070 exhibited similar behavior but was fixed with the 555-series drivers.
The Vulkan drivers are still so/so in terms of performance, but the smoothness is now on-par with my Macbook and Intel GNOME machine.
The thing is that I'm not experiencing this clipboard issue on Plasma, but on a fresh installation of Void Linux with niri. There are reports of this issue all over[1][2][3], so it's clearly not an isolated problem. The frustrating thing is that I wouldn't even know which project to report it to. What a clusterfuck.
I can't go back to X11 since the community is deliberately killing it. And relying on a fork maintained by a single person is insane to me.
No one is killing it. No one being willing to work on it is a very, very different thing, and it's very bad faith and needlessly emotional to attribute malice to a lack of support.
What's in very bad faith is twisting the words of the people who work on these projects[1], and blaming me for echoing them.
It's very clear from their actions[2][3] that they have been actively working to "kill" X11.
There are still people willing to work on it, hence the XLibre fork. The fact that most mainstream distros refuse to carry it is another sign that X11 is in fact being actively "killed".
Far from it. The recent XLibre release[1] has a long list of bugfixes and new features.
Besides, isn't the main complaint from the Wayland folks that X11 is insecure and broken? That means there's still a lot of work to be done. They just refuse to do it.
To be fair, X11 has worked great for me for the past ~20 years, but there are obvious improvements that can be made.
Because one property doesn't guarantee the other. A modular system may imply that it can be extended. An extensible system is not necessarily modular.
Wayland, the protocol, may be extensible, but the implementations of it are monolithic. E.g. I can't use the xdg-shell implementation from KWin on Mutter, and so on. I'm stuck with whatever my compositor and applications support. This is the opposite of modularity.
So all this protocol extensibility creates in practice is fragmentation. When a compositor proposes a new protocol, it's only implemented by itself. Implementations by other compositors can take years, and implementations by client applications decades. This is why it's taken 18 years to get close to anything we can refer to as "stable".
> E.g. I can't use the xdg-shell implementation from KWin on Mutter, and so on.
Why not? It's open-source software. Depending on your architecture you may be able to reuse parts of it.
But as a more flexible choice, there is wlroots.
> and implementations by client applications decades.
Toolkits implement this stuff, so most of the time "support by client application" is a gtk/qt version bump away.
> This is why it's taken 18 years to get close to anything we can refer to as "stable".
Is it really fair to compare the first 10 years of a couple of hobby developers with the current "wide-spread" state of the platform? If it were like today for 18 years and failed to improve, sure, something must be truly problematic. But there were absolutely different phases and uptake of the project, so it moved at widely different speeds.
> Why not? It's open-source software. Depending on your architecture you may be able to reuse parts of it.
"The system is not modular, but you can make it so."
What a ridiculous statement.
> But as a more flexible choice, there is wlroots.
Great! How do I use wlroots as a user?
> Toolkits implement these stuff, so most of the time "support by client application" is a gtk/qt version bump away.
Ah, right. Is this why Xwayland exists, because it's so easy to do? So we can tell users that all their applications will continue to work when they switch to Wayland?
> Is it really fair to compare the first 10 years of a couple of hobby developers with the current "wide-spread" state of the platform?
It's not fair, you're right. I'll wait another decade before I voice my concerns again.
You can see the same problem in the XMPP world, with a lot of the extensions implemented only by a few applications. But at least most XMPP extensions are designed to be backwards-compatible with clients that don't support them.
You know what OS doesn’t handle the notch? OSX. It happily throws the system tray icons right back there, with only an obscure workaround to bring them back. Software quality at Apple these days…
Is there a difference from 2024? Is the M2 still a good choice for Linux? I don’t mind older generations, I’m used to be a bit behind in terms of hardware as a tradeoff for good Linux support.
I used to enjoy the X line of ThinkPads but nowadays I don’t see a point going for them anymore, as the things I appreciated about them are slowly being phased out.
There is no support really for Linux on the M3+, nor should anyone expect the situation to change now that the main devs have moved on.
If you would be happy with a M1/M2 laptop knowing full well that it is a dead end and you will never have another Mac laptop with Linux support (the default assumption at this point), then yes it is a great machine.
How confident are you in this statement? I have no particular knowledge of Asahi. But I do know this narrative emerged about Rust-for-Linux after a couple of high-profile individuals quit.
In that case it was plainly bogus but this was only obvious if you were somewhat adjacent to the relevant community. So now I'm curious if it could be the same thing.
(Hopefully by now it's clear to everyone that R4L is a healthy project, since the official announcement that Rust is no longer "experimental" in the kernel tree).
I know Asahi is a much smaller project than R4L so it's naturally at higher risk of losing momentum.
I would really love Asahi to succeed. I recently bought a Framework and, while I am pretty happy with it in isolation... when I use my partner's M4 Macbook Air I just think... damn. The quality of this thing is head and shoulders above the rest of the field. And it doesn't even cost more than the competition. If you could run Linux on it, it would be completely insane to use anything else.
Someone should create a minimal, nearly-headless macOS distribution (similar to the old hackintosh distros) that bootstraps just enough to manage the machine's hardware, with no UI, and fires up the Apple virtualization framework and a Linux VM, which would own the whole display.
It's similarly bogus here. Early Asahi development tried to upstream as much as possible but ultimately still maintained a gigantic pile of downstream patches, which wasn't a sustainable model.
Most of current development is focused on reducing that pile to zero to get things into a tractable state again. So things continue to be active, but the progress has become much less visible.
M2 to M3 was a complete architectural change that will require a lot of reverse engineering. As far as I know no one is working on this. The M1/M2 work was a labor of love of largely one dev that has since moved on.
The project is still active and working to upstream the work of these devs. But as far as I know, no NEW reverse engineering is being done. Ergo, it’s a dead end.
The idea that a group of people would spend so much of their time trying to get linux to work on Apple hardware through reverse engineering always seemed absolutely crazy to me. I would never consider buying Apple hardware precisely because it doesn't support linux and the work they put in achieves nothing because the risk will always remain that they will lock the hardware further. Nevermind the fact that they will likely never fully reverse engineer all the components.
It just seems like a completely pointless endeavor... Perhaps some people buy into it? Why would anyone buy overpriced hardware with partial support that may one day be gone? The enhanced battery life doesn't really hold much appeal to me, and the ARM architecture, if anything, is just another signal to stay away.
The only thing that makes sense to me is that they wanted the achievement on their resume, and in that given recent developments they succeeded?
The hardware isn’t overpriced, it’s best in class. It’s just that that class isn’t what you’re looking for, and as a Linux user it’s not for you, which is valid! But the hardware, for what it is, is one of the absolute best price-to-performance ratios on the market right now, and I’m tired of people pretending it isn’t. You can get a brand new M4 MacBook Air for under $800 right now, and that’s simply one of the best deals around. For an M2 for Asahi Linux? Second-hand, the prices are even better.
You overlooked the UTM app on the App Store (and open source available too), which wraps Apple Silicon virtualization excellently, or you can use Qemu (which I don't).
I used to use Asahi, but the sleep modes power drain was tedious.
With UTM, I install a recent Fedora ISO, declaring it a "Linux" OS, which exposes the option to skip QEMU and use native Apple Silicon virtualization.
It's fantastic. I mention this only because it's been super useful, way better than Asahi, with minimal effort.
Asahi is all reverse engineering. It’s nothing short of a miracle what has already been accomplished, despite, not because of, Apple.
That said some of the prominent developers have left the project. As long as Apple keeps hoarding their designs it’s going to be a struggle, even more so now.
If you care about FOSS operating systems or freedom over your own hardware there isn’t a reason to choose Apple.
To be clear, the work the asahi folks are doing is incredible. I’m ashamed to say sometimes their documentation is better than the internal stuff.
I’ve heard it’s mostly because there wasn’t an m3 Mac mini which is a much easier target for CI since it isn’t a portable. Also, there have been a ton of hardware changes internally between M2 and M3. M4 is a similar leap. More coprocessors, more security features, etc.
For example, PPL was replaced by SPTM and all the exclave magic.
This is what ruffles my jimmies about this whole thing:
> I’m ashamed to say sometimes their documentation is better than the internal stuff.
The reverse engineering is a monumental effort, this Sisyphean task of trying to keep up with never-ending changes to the hardware. Meanwhile, the documentation is just sitting there in Cupertino. An enormous waste of time and effort from some of the most skilled people in the industry. Well, maybe not so much anymore since a bunch of them left.
I really hope this ends up biting Apple in the ass instead of protecting whatever market share they are guarding here.
I strongly support a project's stance that you shouldn't ask when it will be done. But the time between the M1 launch and a good experience was less than the time that has passed since the M3 launch. I would love to know what is involved.
That's an email from James Calligeros. All this patch says is that the author is Hector Martin (and Sven Peter). The code could have been written a long time ago.
The new project leadership team has prioritized upstreaming the existing work over reverse engineering on newer systems.
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
> Last time, we announced that the core SMC driver had finally been merged upstream after three long years. Following that success, we have started the process of merging the SMC’s subdevice drivers which integrate all of the SMC’s functionality into the various kernel subsystems. The hwmon driver has already been accepted for 6.19, meaning that the myriad voltage, current, temperature and power sensors controlled by the SMC will be readable using the standard hwmon interfaces. The SMC is also responsible for reading and setting the RTC, and the driver for this function has also been merged for 6.19! The only SMC subdevices left to merge is the driver for the power button and lid switch, which is still on the mailing list, and the battery/power supply management driver, which currently needs some tweaking to deal with changes in the SMC firmware in macOS 26.
> Also finally making it upstream are the changes required to support USB3 via the USB-C ports. This too has been a long process, with our approach needing to change significantly from what we had originally developed downstream.
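For anyone wondering what "readable using the standard hwmon interfaces" means in practice: hwmon sensors show up under /sys/class/hwmon like on any other Linux box, so once the SMC subdevice drivers land, generic tools and trivial scripts can read them. A minimal sketch of that generic interface follows; nothing here is Asahi-specific, and the sensor names and availability will vary per machine:

    import glob, os

    # Walk every hwmon device and dump its raw sensor readings.
    # hwmon convention: temp*_input is millidegrees C, in*_input is mV,
    # curr*_input is mA, power*_input is microwatts, fan*_input is RPM.
    for hwmon in sorted(glob.glob("/sys/class/hwmon/hwmon*")):
        with open(os.path.join(hwmon, "name")) as f:
            print(hwmon, "->", f.read().strip())
        for sensor in sorted(glob.glob(os.path.join(hwmon, "*_input"))):
            label_file = sensor.replace("_input", "_label")
            label = ""
            if os.path.exists(label_file):
                with open(label_file) as f:
                    label = f.read().strip()
            try:
                with open(sensor) as f:
                    value = f.read().strip()
            except OSError:
                continue  # some sensors report errors until first used
            print(" ", os.path.basename(sensor), label, value)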
Hard disagree: try the UTM app on the App Store (or build it from open source) and you get Apple Silicon native virtualization and super simple installation of AArch64 Linux distros from an ISO.
I've been doing this for maybe a year, after frustration with power draw and sleep modes (and dual boot) with Asahi.
It's been great... and Apple Silicon is still super efficient, which is why I said hard disagree.
Given the speed of the progress that Apple has made on their hardware (from M1 to M5), I think the project was doomed from the very beginning. Reverse engineering per se is a huge talent drain that wastes a tremendous amount of man-hours on a closed problem. Also, the strong SW-HW integration of the Mac is sophisticated and fragile, and difficult to analyze and replicate. Nailing all those details is not only time consuming but also limited in scope, and never yields anything beyond the status quo.
I’m quite glad that those talented folks finally escaped from the pit of reverse engineering. It may be fun and interesting, but its future was already capped by Apple. I hope they find another direction, hopefully something more original and progressive. Stop chasing and push forward.
Very little progress made this year after high profile departures (Hector Martin, project lead, Asahi Lina and Alyssa Rosenzweig - GPU gurus). Alyssa's departure isn't reflected on Asahi's website yet, but it is in her blog. I believe she also left Valve, which I think was sponsoring some aspects of the Asahi project. So when people say "Asahi hasn't seen any setbacks" be sure to ask them who has stepped in to make up for these losses in both talent and sponsorship.
I have no insight into the Asahi project, but the LKML link goes to an email from James Calligeros containing code written by Hector Martin and Sven Peter. The code may have been written a long time ago.
Without official support, the Asahi team needs to RE a lot of stuff. I’d expect it to lag behind by a couple of generations at least.
I blame Apple for pushing out new models every year. I don’t get why it does that. An M1 is perfectly fine after a few years, but Apple treats it like an iPhone. I think one new model every 2-3 years is good enough.
The M1 is indeed quite adequate for most, but each generation has brought substantial boosts in single-threaded and multi-threaded performance, and, with the M5 generation in particular, in GPU-bound tasks. These advancements are required to keep pace with the industry and, in a few aspects, stay ahead of competitors; plus there are high-end users whose workloads greatly benefit from these performance improvements.
I agree. But Apple doesn’t sell new M1 chip laptops anymore AFAIK. There are some refurbished ones but most likely I need to go into a random store to find one. I only saw M4 and M5 laptops online.
That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger. Sure it is probably better for Apple to move forward quickly though.
In the US, Walmart is still selling the M1 MacBook Air new, for $599 (and has been discounted to $549 or better at times, such as Black Friday).
In general, I don't think it's reasonable to worry that Apple's products aren't thoroughly achieving economies of scale. The less expensive consumer-oriented products are extremely popular, various components are shared across product lines (eg. the same chip being used in Macs and iPads) and across multiple generations (except for the SoC itself, obviously), and Apple rather famously has a well-run supply chain.
From a strategic perspective, it seems likely that Apple's long history of annual iteration on their processors in the iPhone and their now well-established pattern of updating the Mac chips less often but still frequently is part of how Apple's chips have been so successful. Annual(ish) chip updates with small incremental improvements compounds over the years. Compare Apple's past decade of chip progress against Intel's troubled past decade of infrequent technology updates (when you look past the incrementing of the branding), uneven improvements and some outright regressions in important performance metrics.
> That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger.
Why would this be true? An M5 MacBook Air today costs the same as an M1 MacBook Air cost in 2020 or whenever they released it, and is substantially more performant. Your dollar per performance is already better.
If they kept selling the same old stuff, then you spread production across multiple different nodes and the pricing would be inherently worse.
If you want the latest and greatest you can get it. If an M1 is fine you can get a great deal on one and they’re still great machines and supported by Apple.
Among other things, I quite like the blog, especially the travel photos. There’s some ‘old Internet / old blogosphere’ vibe to it.
Not much to add to the topic of having a MacBook Air M2 with Linux. Glad it works well; I’m eyeing an M1 for myself. Yet my Retina (2014 model) works perfectly with Arch Linux (including sleep), so I’m patiently waiting for it to break. And at the same time I hope it works another decade, so good it is. Battery life is still decent: between 3 and 4 hours with the battery at 50% of capacity. So I expect a new battery could give me up to 8 hours, which is pretty impressive for me. In reality, I don’t need a session of over an hour or two. The only thing I miss is USB-C charging, as that way I could charge with anything when I have no charger on me. Again, in reality, that’s a pretty rare scenario.
The author mentions he paid $750 for a MacBook Air M2 with 16GB, while on Amazon an M4 Air with 16GB is usually $750-800. I get that the M4/M3 aren't supported to boot Asahi yet, but still.
I really wanted this to work, and it WAS remarkably good, but palm rejection on the (ginormous) Apple trackpad didn't work at all, rendering the whole thing unusable if you ever typed anything.
That was a month ago, this article is a year old. I'd love to be wrong, but I don't think this problem has been solved.
Yeah what is up with that? When I've tried to look into it I've just been met with statements that palm rejection should pretty much just work, but it absolutely doesn't and accidental inputs are so bad it's unusable without a disable/enable trackpad hotkey.
All Firefox users should switch to librewolf. In the short term it’s for telling Mozilla to go f**; in the long term it’s a browser fork with really good anti-fingerprinting.
Note that librewolf relies on Mozilla's infrastructure for account synchronization and plugin distribution. If you are truly hostile to this organization, is there another browser you can recommend?
Asahi is awesome! But this also proves that laptops outside the MacBook realm really need to improve so much. I wish there were a Linux machine with the hardware quality of a MacBook.
Agreed. On the computer hardware side:
* x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance and power efficiency
* Qualcomm kinda fumbled the Snapdragon X Elite launch with nonexistent Linux support and shoddy Windows stability, but here's to hoping that they "turn over a new leaf" with the X2.
Actually, some Snapdragon X Elite laptops do run Linux now, but performance is not great as there were some weird regressions and anyway newer chips have caught up [1].
On the build quality side, basically all the PCs are still lagging behind Apple, e.g. yesterday's rant post about the Framework laptop [2] touched on a lot of important points. Of course, there are the Thinkpads, which are still built decently but are quite expensive. Some of the Chinese laptops like the Honor MagicBooks could be attractive and some reddit threads confirm getting Linux working on them, but they are hard to get in the US. That said, at least many non-Apple laptops have decent trackpads and really nice screens nowadays.
[1] https://www.phoronix.com/review/snapdragon-x-elite-linux-eoy...
[2] https://news.ycombinator.com/item?id=46375174
I have no faith in Qualcomm to make even basic gestures towards the Linux community.
All I want is an easy way to install Linux on one of the numerous Snapdragon laptops. I think the Snapdragon Thinkpad might work, but none of the others really do.
A $400 Arm laptop with good Linux support would be great, but it's never ever going to happen.
Fact is, Linux support has accelerated heavily, both from Qualcomm and from Linaro on their behalf. Anyone who watches the Linux ARM mailing lists can attest to that.
Things have definitely changed, a lot.
Hardware has already been out for a year. Outside a custom spin by the Ubuntu folks, even last year's notebooks aren't well supported out of the box on Linux. I have a Yoga Slim 7x and I tried the Ubuntu spin out at some point - it required me to first extract the firmware from the Windows partition because Qualcomm had not upstreamed it into linux-firmware. Hard to take Qualcomm seriously when the situation is like this.
Qualcomm _does_ upstream all their firmware, but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load. This is an actual security feature, believe it or not. Besides, chances are it wasn't even Qualcomm's firmware, but rather Cirrus for sound or display firmware, etc.
I get the hate on Qualcomm, but you're really one LLM question away from understanding why they do this. I should know, I was also getting frustrated before I read up on this.
I get where you're coming from, but I think the job of a company pushing a platform is to make it "boring", i.e. it should work out of the box on Debian/Fedora/Arch/Ubuntu. The platform vendor (Qualcomm) is the only one with enough sway to push the different laptop manufacturers to do the right thing. This is the reason why both Intel and Windows push compliance suites with a long list of requirements before anyone can put the Windows / Intel logo on their device. If Qualcomm is going to let Acer / Lenovo decide whether things work out of the box on Linux, then it's never going to happen.
Fantastic.
Can you please let me know if there is an ISO to get any mainstream Linux distro working on this Snapdragon laptop ?
ASUS - Vivobook 14 14" FHD+ Laptop - Copilot+ PC - Snapdragon X
It's on sale for $350 at Best Buy, and if I can get Linux working on it, it would definitely be an awesome gift for myself.
Even if there's some progress being made, it's still nearly impossible to install a typical Linux distro on one of these. I've been watching this space since the Snapdragon laptops were announced. Tuxedo giving up and canceling their Snapdragon Linux laptop doesn't instill much confidence.
There's an Ubuntu release specifically targeting new Qualcomm Elite based laptops: https://discourse.ubuntu.com/t/ubuntu-concept-snapdragon-x-e...
This includes Vivobook S15, not sure about the 14.
That covers the Elite, not the cheaper Snapdragon X laptops such as the ASUS Vivobook 14 (X1407QA).
I've followed that thread for almost a year. It's a maze of hardware issues and poor compatibility.
From your other response.
>but vendors usually require a firmware binary to be signed with their keys, which are burned into the SoC. As a result you cannot use Qualcomm's vanilla firmware and need to extract the original firmware as provided by the vendor, otherwise it won't load.
This makes the install process impossible without an existing Windows install. It's easier to say it doesn't work and move on.
It's going to be significantly easier to buy and run Linux on an x86 laptop.
Not to mention that no out-of-the-box Linux Snapdragon Elite laptop exists. It's a shame, because it would probably be an amazing product.
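For what it's worth, the extraction step people are describing is mostly file copying once the Windows partition is mounted: the vendor-signed blobs sit in the Windows driver store and need to end up under /lib/firmware. Below is a rough, hypothetical sketch only; the file patterns and the destination subdirectory are placeholders that differ per laptop, so don't treat this as the actual procedure:

    import glob, os, shutil

    # Hypothetical illustration: copy vendor-signed firmware blobs from a
    # mounted Windows partition into /lib/firmware. Real installs need the
    # correct per-device subpaths; everything below is a placeholder.
    WIN_DRIVERSTORE = "/mnt/windows/Windows/System32/DriverStore/FileRepository"
    DEST = "/lib/firmware/qcom/SOC_NAME/VENDOR_NAME"  # placeholder destination

    os.makedirs(DEST, exist_ok=True)
    for pattern in ("**/*.mbn", "**/*.bin"):
        for src in glob.glob(os.path.join(WIN_DRIVERSTORE, pattern), recursive=True):
            print("copying", src)
            shutil.copy2(src, DEST)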
This sounds a lot like how AMD’s approach had changed on Linux and still everyone I know who wants to use their GPU fully used Nvidia. For a decade or more I’ve heard how AMD has turned over a new leaf and their drivers are so much better. Even geohot was going to undercut nvidia by just selling tinygrad boxes on AMD.
Then it turned out this was the usual. Nothing had changed. It was just that people online have this desire to express that “the underdog” is actually better. Not clear why because it’s never true.
AMD is still hot garbage on Linux. Geohot primarily sells “green boxes”. And the MI300x didn’t replace H100s en masse.
Maybe it's just that you're mostly viewing this through the LLM lens?
I remember having to fight with fglrx, AMD's proprietary Linux driver, for hours on end. Just to get hardware acceleration for my desktop going! That driver was so unbearable I bought Nvidia just because I wanted their proprietary driver. Cut the fiddling time from many hours to maybe 1 or 2!
Nowadays, I run AMD because their open-source amdgpu driver means I just plonk the card into the system, and that's it. I've had to fiddle with the driver exactly zero times. The last time I used Nvidia is the distant past for me. So - for me, their drivers are indeed "so much better". But my usecase is sysadmin work and occasional gaming through Steam / Proton. I ran LMStudio through ROCm, too, a few times. Worked fine, but I guess that's very much not representative for whatever people do with MI300 / H100.
> and occasional gaming through Steam / Proton
And how does that work on AMD? I know the Steam Deck is AMD but Valve could have tweaked the driver or proton for that particular GPU.
I've been playing lots of games on an AMD GPU (RX 7600) for about a year and I can't remember a game that had graphical issues (eg driver bugs).
Probably something hasn't run at some point but I can't remember what, more likely to be a Proton "issue". Your main problem will be some configuration of anti-cheat for some games.
My experience has been basically fantastic and no stress. Just check that games aren't installing some Linux build, which are inevitably extremely out of date and probably won't run. Ex: Human Fall Flat (very old, won't run), Deus Ex Mankind Divided (can't recall why, but I elected to install the Proton version; I think performance was poor or mouse control was funky).
I guess I don't play super-new games so YMMV there. Quick stuff I can recall, NMS, Dark Souls 1&2&3, Sekiro, Deep Rock Galactic, Halo MCC, Snow runner & Expeditions, Eurotruck, RDR1 (afaik 2 runs fine, just not got it yet), hard space ship breaker, vrising, Tombraider remaster (the first one and the new one), pacific drive, factorio, blue prince, ball x pit, dishonored uhhh - basically any kind of "small game" you could think of: exapunks, balatro, slay the spire, gwent rougemage, whatever. I know there were a bunch more I have forgotten that I played this year.
I actually can't think of a game that didn't work... Oh this is on Arch Linux, I imagine Debian etc would have issues with older Mesa, etc.
Works very well for me! YMMV maybe depending on the titles you play, but that would probably be more of a Proton issue than an AMD issue, I'd guess. I'm not a huge gamer, so take my experience with a grain of salt. But I've racked up almost 300 hours of Witcher3 with the HQ patch on a 4k TV display using my self-compiled Gentoo kernel, and it worked totally fine. A few other games, too. So there's that!
Don’t know what LLM lens is. I had an ATI card. Miserable. Fglrx awful. I’ve tried various AMDs over the last 15 years. All total garbage compared to nvidia. Throughout this period was consistently informed of new OSS drivers blah blah. Linus says “fuck nvidia”. AMD still rubbish.
Finally, now I have 6x4090 on one machine. Just works. 1x5090 on other. Just works. And everyone I know prefers N to A. Drivers proprietary. Result great. GPU responds well.
Google has previously delivered good Linux support on Arm Chromebooks and is expected to launch unified Android+ChromeOS on Qualcomm X2 Arm devices in 2026.
Isn't Google moving to Fuchsia?
I don't think these are mutually exclusive, they're just unifying ChromeOS and Android for now.
Fuchsia is dead sadly
I thought it was the opposite and that it would replace Linux for Google products
It's very alive. It's being used for Google Nest Hub devices. Though for HN it might as well be dead, it seems.
“Rumors of my death are greatly exaggerated”
Google folks pop up here and there and say it’s actively worked on. Unless you have more recent information, I believe the project is still alive.
Citation needed? I don’t disbelieve you but I haven’t seen anything concrete.
On bare metal or pKVM?
> x86 chips can surpass the M series cpus in multithreaded performance, but are still lagging in singlethreaded performance
Nodding along with the rest but isn't this backwards? Are M series actually outperforming an Intel i9 P-core or Ryzen 9X in raw single-threaded performance?
Not in raw performance, no, but they're only beat out by i9s and the like, which are very power hungry. If you care even a little bit about performance per watt, the M series are far superior.
Have a look at Geekbench's results.[1] Ignore the top ones, since they're invalid and almost certainly cheated (click to check). The iPads and such lower down are all legit, but the same goes for some of the i9s in between.
And honestly, the fact that you have to go up to power hungry desktop processors to even find something to compete with the chip that goes in an (admittedly high-end) iPad, is somewhat embarrassing on its face, and not for Apple.
https://browser.geekbench.com/v6/cpu/singlecore
Yes, the M4 is still outperforming the desktop 9950X in single-threaded performance on several benchmarks like Geekbench and Cinebench 2024 [1]. Compared to the 9955HX, which is the same physical chip as the 9950X but lower clocked for mobile, the difference is slightly larger. But the 16 core 9950X is obviously much better than the base M4 (and even the 16 core M4 Max, which has only 12 P cores and 4 E cores) at multithreaded applications.
However, the M2 in the blog post is from 2022 and isn't quite as blazingly fast in single thread performance.
[1] https://nanoreview.net/en/cpu-compare/apple-m4-8-cores-vs-am...
Does an i9 P-core or Ryzen 9X run on 3.9 W while posting on HN?
That's irrelevant to that claim being true or not. The fact that M series win in power efficiency is already addressed.
Dealing with Honor support is a pain. They don't understand anything at all, and it's impossible to get them off their script if you have a problem.
I have a Honor 200 pro, and the software is buggy and constantly replaces user configurations with their defaults every 3 or 4 days.
I would avoid anything Honor in the future at any cost.
Honor, strangely enough, doesn't make any effort to really support Linux.
The machine quality is pretty damn good, but Huawei machines are still better. Apple level of quality. And Huawei releases their machines with Linux preinstalled
The company to watch is Wiko. It's their French spin-off to sidestep the chip ban. They might put out some very nice laptops, but it's a bit TBD.
The closest laptop to MacBook quality is surprisingly the Microsoft Surface Laptop.
As to x86, Zen 6 will be AMD's first major architecture rework since Apple demonstrated what is possible with wide decode (well, more accurately, since the world took notice, because wide decode happened long before the M1). It still likely won't match the M5 or even the M4 in single-threaded performance per watt, but hopefully it will be close.
I bought a refurb gen 4 thinkpad on amazon for like $350 and it arrived almost brand new.
Installed Arch, set up some commands to underclock the processor on login and easily boost it when I'm compiling.
Battery life is great but I'm not running a GUI either. Good machine for when I want to avoid distractions and just code.
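In case anyone wants to copy the "underclock on login, boost for compiles" idea: the standard cpufreq sysfs knobs are enough for a sketch like the one below. Run as root; the low cap is a made-up example value, so pick one your CPU actually supports:

    import glob, sys

    # Cap or restore the maximum CPU frequency via the cpufreq sysfs interface.
    #   freqcap.py low  -> cap the clock (quiet, cool, longer battery)
    #   freqcap.py max  -> restore the hardware maximum (for compiles)
    LOW_KHZ = 1_800_000  # example cap in kHz, machine-specific

    def set_max_freq(target_khz=None):
        for policy in glob.glob("/sys/devices/system/cpu/cpufreq/policy*"):
            if target_khz is None:
                with open(f"{policy}/cpuinfo_max_freq") as f:
                    khz = int(f.read())  # hardware limit reported by the driver
            else:
                khz = target_khz
            with open(f"{policy}/scaling_max_freq", "w") as f:
                f.write(str(khz))

    if __name__ == "__main__":
        set_max_freq(LOW_KHZ if sys.argv[1:] == ["low"] else None)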
My personal beef with Thinkpads is the screen. Most of the thinkpads I’ve encountered in my life (usually pretty expensive corporate ones) had shitty FHD screens. I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
FWIW if you buy new from Lenovo, getting a more high-res display has been an option for years.
I'm on the other side where I've been buying Thinkpads partly because of the display. Thinkpads have for a long time been one of the few laptop options on the market where you could get a decent matte non-glare display. I value that, battery life and performance above moar pixels. Sure I want just one step above FHD so I can remote 1080p VMs and view vids in less than fullscreen at native resolution but 4K on a 14" is absolute overkill.
I think most legit motivations for wanting very high-res screens (e.g. photo and video editing, publishing, graphics design) also come with wanting or needing better quality and colors etc too, which makes very-highly-scaled mid-range monitors a pretty niche market.
> I got too spoiled by retina screens, and I can’t comfortably use anything with lower DPI.
Did you make a serious effort while having an extended break from retina screens? I'd think you would get used to it pretty quickly if you allow yourself to readjust. Many people do multi-DPI setups without issues - a 720p and a 4k side-by-side for example. It just takes acclimatizing.
I have a 14” FHD panel (158 dpi) on an old (7 year) laptop and there’s more issues with low resolution icons and paddings than with font rendering. I wouldn’t mind more, but it’s not blurry.
I just learned on Reddit the other day that people replace those screens with third party panels, bought from AliExpress for peanuts. They use panelook.com to find a compatible one.
If you buy an X1 from Lenovo the screen is definitely going to be better. If not, you can simply swap the screen on most of the other models.
Old Thinkpads are great! I used to have a Lenovo Thinkpad X1 Carbon Gen 6 with Intel Core i7 8640U, 16 GB of RAM, and 1 TB SSD. I installed Arch Linux on it with Sway.
> On the build quality side, basically all the PCs are still lagging behind Apple,
This is an oft-repeated meme, but not really true. Thinkpads, high-end lightweight gaming laptops like the Asus G14... There are many x86 laptops with excellent build quality.
Check out Ubuntu Certified hardware[1].
I've moved completely to EliteBooks and am very happy with my decision. The build quality is superb, they're upgradeable, everything is replaceable and there's an excellent market and after market for parts, and HP has codepaths in their firmware for Linux support, meaning even Modern Standby works well.
Price points for refurb and used hardware are great, too.
[1] https://ubuntu.com/certified
But they’re heavier, slower, have more impactful active cooling, have much worse battery life (mostly due to the processor), and have some lower quality user interface components. Don’t get me wrong they’re decent hardware! It’s just the macbook air benchmark is very high.
The key qualities of something like a macbook air are:
It has no fans.
Its temperature never changes unless you really push it. Every other laptop I've used gave off at least some warmth just from being turned on.
My m1 air still has enough battery to run for a full day of usage, here several years after I bought it. Basically never loses power while the lid is closed either, but that is less of an issue.
Looking at a Thinkpad 16" P1 Gen 8 with 2X 1TB SSD, 64GB RAM, QHD+ screen, centered keyboard like MBP (i.e. no numpad), integrated Intel GPU, lightweight (4 lbs) for a little under $2.5K USD.
Closest I've found to an MBP 16" replacement.
Have been running Dell Precision laptops for many years on Linux, not sure about Lenovo build quality and battery life, but hoping it will be decent enough.
Would run Asahi if it supported M4 but looks it's a long ways away...
How is battery life? I still use MacBooks only because of that
Does lid close to sleep and open to wake work as expected?
I'm using T14s Gen 4 Intel and sleep works for me. I'm using it in clamshell mode connected to external display 99% of the time, so I don't really use sleep all the time, but the few times I tested it, it worked. Actually every hardware peripheral, including fingerprint sensor, worked out of the box. I was pleasantly surprised by that kind of support.
I've got a relatively new p16s with a hybrid Nvidia/Intel GPU, and a p14s gen 5 with an AMD GPU, and I was able to get both of them to suspend by closing the lid. Not sure if the issue you speak of is unique to the P1 or not, but all my ThinkPads have been decent with Linux.
Thanks.
I’ve had issues with the T14s for a couple of gens where the machine wakes up while the lid is closed and runs the battery down. I’ve tried the usual troubleshooting.
This has been a non issue on Dell machines for almost 20 years.
Somewhat related, yet not: I had a Dell laptop nearly kill itself by waking up while in my backpack and nearly melting. I think I blame Windows Update for this, though. This resulted in the laptop not being able to power on most of the time after that.
That's possibly because they had S3 disabled and used 'modern standby': https://learn.microsoft.com/en-us/windows-hardware/design/de...
Oh some kernel params and other settings can help with that. These are mine, and it's been working great:
Kernel params
Other settings (executed with a systemd service) (also only needed on the p16s, not on my p14s).

I am giving my MacBook Air M2 15” to my wife and bought a Lenovo E16 with a 120hz screen to run Kubuntu last night. She needed a new laptop, and I have had enough of macOS and just need some stuff to work, which will be easier on Intel and Linux. Also, I do bookwork online, so the bigger screen and dedicated numpad will be nice. It reviews well and seems like good value for money with the current holiday sales, but I don’t expect the same hardware quality or portability, just a little more freedom. I hope I’m not too disappointed. https://www.notebookcheck.net/Lenovo-ThinkPad-E16-G3-Review-...
If you're running desktop Linux, you will have a better experience with a rolling release than being stuck with whatever state the software that was frozen in Debian/Ubuntu is in, especially when it comes to multimedia, graphics, screen sharing, etc.
Modern desktop Linux relies on software that's being fixed and improving at a high velocity, and ironically, can be more stable than relying on a distro's fixed release cycles.
KDE Plasma, Wayland support, Pipewire, etc all have had recent fixes and improvements that you will not get to enjoy for another X months/years until Canonical pulls in those changes and freezes them for release.
Similarly, newer kernels are a must when using relatively recent hardware. Fixes and support for new hardware lands in new kernels, LTS releases might not have the best support for your newer hardware.
> can be more stable than relying on a distro's fixed release cycles
Stability for a distro means “doesn’t change” not “doesn’t crash”.
Debian/ubuntu are stable because they freeze versions so you can even create scripts to work around bugs and stuff and be sure that it will keep working throughout that entire release.
Arch Linux is not stable because you get updates every day or whatever. Maybe you had some script or patch to work around a bug and tomorrow it won’t work anymore.
This does not say _anything_ about crashing or bugs, except that if you find a bug/crash on a stable system then it is likely you can rely on this behaviour.
Agree. If you use a rolling release you definitely need a strategy for stability. I turn off automatic updates and schedule planned full updates that I can easily roll back from. I've had two breakages over the years that required snapper rollback. (Rolling back from a major distro upgrade isn't that easy)
It's a tradeoff that I'm happy with. I get to have a very up to date system.
I just upgrade Ubuntu every 6 months. To me that's a pretty good compromise between up to date packages and stability.
> you will have a better experience with a rolling release than being stuck with whatever state the software that was frozen in Debian/Ubuntu is in
That's a wild statement!
That’s an interesting comment. I didn’t think about that. I’ve only ever used Ubuntu flavours, so I’ll look into what the popular rolling releases are, out of interest.
I would recommend Fedora KDE Edition over Kubuntu, but I guess it's a personal choice.
I’ve only ever used Ubuntu flavours but maybe I should give it a try. Thanks
Fedora also has ThinkPad compability program and a nice way to install/update Lenovo drivers.
The problem with Ubuntu, as other mentioned, is that you get ancient version of some packages. Fedora is nicely up to date.
Is this actually such a big point? I feel like (subjectively) on Ubuntu everything gets updated just as fast, and even if not, there's a new full release every 6 months. Or is this actually rather slow in comparison to Fedora?
I've also only used Debian based stuff my whole life and even moving from apt to dnf or whatever it was causes too much friction for me haha, though it's not that bad obviously, if I really would see the positives.
I outfitted our 10 person team with the E16 g2 and it’s been great.
Two minor issues: it’s HEAVY compared to the T models.
Because of the weight, try not to walk around with the lid up while holding it by one of the front corners. I’ve noticed one of them is kind of warped from walking around the office holding it that way.
> I’ve noticed one of them is kind of warped from walking around the office holding it that way.
That’s not at all reassuring.
That’s great news thanks. I got the gen 3 so maybe some improvements. Weight is ok as I really just move it around the house. I buy used Panasonics for the workshop.
Are you running windows?
Kubuntu is nice. Not sure why it's not more popular. Or maybe it's just a quieter user base?
Been a kubuntu user since .. 2006? 2007? Don't remember when kubuntu became a thing, but as soon as I tried Ubuntu, I went kubuntu. I believe it was 5.10 or 6.04 or something. :-)
Am growing tired of Ubuntu though. Just not sure where I should turn. I want a .deb based system. Ubuntu is pushing snaps too heavily for my liking.
Just use Debian and switch it to Testing. Works amazingly well and you'll always have relatively current and generally stable software.
So, Debian? No snaps and that’s my main motivation
I was a very long time Debian user who got burned by Ubuntu and derivatives far too many times, personally and professionally. I moved to Fedora a few years back and it was a great decision. No regrets.
I liked Ubuntu and its variants back when it first came out and I was newer to Linux, but it didn't take long for me to realise there always seemed to be a better option for me as a daily driver. To me it's like a new-Linux-user OS where a lot of stuff is chosen for you to use basically as-is. Even the name Kubuntu, where the K is for KDE, whereas on other distros you would just choose your DE when you install.
I agree. It feels like combination of peak windows UI with the ease of Ubuntu baked in. Then the little mobile app they have that gives you shared clipboard with iOS is cool.
Also consider Kinoite, the immutable Fedora KDE (like Silverblue). Very effective and robust.
If I were you I would have gone for the T or X series.
Why?
E is more like a budget line of ThinkPads, at least compared to X and T.
Starlabs are good quality Linux laptops, designed in house. Love my starbook
Never used MacBooks, but Lenovo Thinkpad laptops with Linux are really good in my experience. Get anything recent with AMD.
The best recent experience is arguably with current Intel chips, actually, because of the battery usage that can reach 20 hours, easily matching Macbooks: https://www.notebookcheck.net/Intel-empire-strikes-back-with...
Performance is still very high so if they don't need the current top tier AMD horsepower, Intel is the way to go. It's also quieter, cooler and doesn't throttle. Not to mention the ability to use SRIOV GPU for running Windows software in a VM.
Also, Lenovo tends to limit HiDPI displays to Intel CPUs, for some, ahem, unknown reason.
With the current situation at Intel (lots of Linux developers leaving), I'd stick with AMD anyway.
> I wish there were a Linux machine with the hardware quality of a MacBook
It really depends what you mean by "quality". To me first and foremost quality I look for in a laptop is for it to not break. As I'm a heavy desktop user, my laptop is typically with me on the couch or on vacation. Enter my MacBook Air M1: after 13 months, and sadly no extended warranty, the screen broke for no reason overnight. I literally closed it before going to bed and when I opened the lid the next day: screen broken. Some refer to that phenomenon as the "bendgate".
And every time I see a Mac laptop I can't help but think "slick and good looking but brittle". There's a feeling of brittleness with Mac laptops that you don't have with, say, a Thinkpad.
My absolute best laptop is a MIL-SPEC (I know, I know, there are many different types of military specs) LG Gram. Lighter than a MacBook too. And every single time I demo it to people, I take the screen and bend it left and right. This thing is rock solid.
I happen to have this laptop (not my vid) and look at 34 seconds in the vid:
https://youtu.be/herYV5TJ_m8
The guy literally throws my laptop (well, the same) down concrete stairs and the thing still just works fine.
The friend who sold it to me (I bought it used) one day stepped on it when he woke up. No problemo.
To me that is quality: something you can buy used and that is rock solid.
Where are the vids of someone throwing a MacBook Air down the stairs and the thing keeps working?
I'm trading a retina display any day for a display that doesn't break when it accidentally falls on the ground.
Now, I love the look and the incredible speed of the MacBook Air laptops (I still have my M1, but since its screen broke I turned it into a desktop), but I really wish they were not desk queens: we've got desktops for that.
I don't want a laptop that requires exceptional care and mad packaging skills when putting it inside a backpack (and which then requires the backpack to be handled with extreme care).
So: bring me the raw power and why not the nice look of a MacBook Air, but make it sturdy (really the most important for me) and have it support Linux. That I'd buy.
> Where are the vids of someone throwing a MacBook Air down the stairs and the thing keeps working?
For some anecdata, I have:
Stood on mine. Poured water on it. Been hit by a car while cycling and fallen on it. Dropped it.
It’s fine. Has a few scratches and a small dent. The predecessor is a 2013 Air which has had a hard life. It’s going great.
A colleague put a piece of a4 paper between keyboard and screen then closed it, squeezed it and cracked the screen. Don’t do that.
Notice how much the screen wobbles after opening the laptop, around the one minute mark. That does not happen even with the cheapest Macbook Air, that’s the kind of design quality people refer to.
As for light and sturdy, the Netbook era had it all. A shame the world moved on from that.
Counter anecdata.
My wife is the bane of electronic devices.
Phones simply won't survive a week without an industrial case. Screen projectors last as short as a single day.
The only computers that survived her JerryRigEverything levels of abuse are MacBooks+, which routinely fall off tables, stairs, or simply hands.
One even fell off open 90 degrees and rotationally fell right on the far edge at what would be the maximum torque position; there was massive deformation of the lid aluminum but the lid was still flat, the glass had no cracks, and the whole thing perfectly functional.
(note: these are the older designs from the first unibody to the last Intel laptop, not the newer Mx ones)
+ Well, except one, which had an entire pint toppled over and sloshed right onto the screen, with the liquid sliding straight into the exhaust vents (there was an audible poof as the screen went black).
I've owned two LG gram laptops. Neither were milspec, but both were really nice. Sure, the screen quality isn't going to win any awards, nor will the speakers, but the light weight, fantastic battery life and snappy performance always get a recommendation from me.
We almost had really nice arm laptops, but they got super greedy about it having AI and no one wanted them.
ARM is a capricious licensor. It's hardly surprising.
I never understood why people claim the Macbook is so good.
Bad keyboard, bad aluminium body, soldered ram...
Is it just the Apple Silicon that somehow makes it worth it? It's ARM, most software is still written and optimized for x86.
I adore my Linux setup and have switched back to it after using M1 Pro for 3 years.
But through all the Dells, Thinkpads and Asus laptops I've had (~10), none were remotely close to a full package that MBP M1 Pro was.
- Performance - outstanding
- Fan noise - non-existent 99% of the time, cannot compare to any other laptop I had
- Battery - not as amazing as people claim for my usage, but still at least 30% better
- Screen, touchpad, speakers, chassis - all highest tier; some PC laptops do screen (Asus OLED), keyboard and chassis (Thinkpad) better, but nothing groundbreaking...
It's the only laptop I've ever had that gave me a feeling that there is nothing that could come my way, and I wouldn't be able to do on it, without any drama whatsoever.
It's just too bad that I can't run multiple external displays on Asahi...
(For posterity, currently using Asus Zenbook S16, Ryzen HX370, 32GB RAM, OLED screen, was $1700 - looks and feels amazing, screen is great, performance is solid - but I'm driving it hard, so fan noise is constant, battery lasts shorter, and it's just a bit more "drama" than with MBP)
iirc M1 just cannot do multiple displays at all :-(
A modern M4 should tho
Excellent power efficiency in apple silicon - good battery life and good performance at the same time. The aluminum body is also very rigid and premium feeling, unlike so many creaky bendy pc laptops. Good screen, good speakers.
Aluminum and magnesium non-Apple laptops are just as stiff. There's just a wider spectrum of options, including $200 plastic ARM Chromebooks available.
Do you have any examples? The top-of-the-line Surface laptops are still comparatively flimsy, same for Samsung and Vaio. What’s better?
> Is it just the Apple Silicon that somehow makes it worth it? It's ARM, most software is still written and optimized for x86.
I am very much a Linux person. But the battery life with macOS on the Apple Silicon is absolutely insane.
Yes, this is the true dividing factor for me. The battery life of the new ARM laptops is an astounding upgrade from any device I have ever used.
I've been a reluctant MacBook user for 15 years now thanks to it being the de-facto hardware of tech, but for the first time ever since adopting first the M1 Pro and then an M2 Pro I find myself thinking: I could not possibly justify buying literally any other laptop so long as this standard exists.
Being able to run serious developer workflows silently (full kubernetes clusters, compilers, VSCode, multitudes of corpo office suite products etc), for multiple days at a time on a single charge is baffling. And if I leave it closed for a week at 80% battery, not only does that percentage remain nearly the same when resumed-- it wakes instantly! No hibernation wake time shenanigans. The only class of device which even comes close to being comparable are high end e-ink readers, and an e-ink reader STILL loses on wake time by comparison.
I'm at the point now where I'm desperately in need of an upgrade for my 8 year old personal laptop, but I'm holding off indefinitely until I discover something with a similar level of battery performance that can run Linux. As I understand it, the firmware that supports that insane battery life and specifically the suspend functionality that allows it to draw nearly zero power when closed isn't supported by any Linux distro or I would have already purchased another MacBook for personal use.
>But the battery life with macOS on the Apple Silicon is absolutely insane.
Run a lightweight DE like i3wm with any modern thinkpad and you will get similar usable battery life of around 6-8 hours.
Generally though, battery life isn't an issue anymore considering fast charging is everywhere.
I am running Sway on a Gentoo on a 4 years old X1 carbon.
> you will get similar usable battery life of around 6-8 hours
My MacBook M3 gives me way more than 6-8 hours, it's simply insane. It literally lasts for multiple days.
> Generally though, battery life isn't an issue anymore considering fast charging is everywhere.
Not an issue indeed, I got used to always charging my X1 carbon. But then I got an M3 for work, and... well it feels like I don't have to charge it ever :-).
As I said: very much a Linux person, but the M3 battery life is absolutely insane.
I’ve never heard someone describe the aluminum body as bad.. what do you not like about it?
The number one benefit is the Apple Silicon processors, which are incredibly efficient.
Then it’s the trackpad, keyboard and overall build quality for me. Windows laptops often just feel cheap by comparison.
Or they’ll have perplexing design problems, like whatever is going on with Dell laptops these days with the capacitive function row and borderless trackpad.
The keyboard and body are not bad at all - rather, they're best in class, and so is the rest of the hardware. It is a premium hardware experience, and has been since Jony Ive left, which is what makes the software so disappointing.
"... bad aluminium body ..."
Would you elaborate ?
I believe there are a few all-metal laptops competing in the marketplace but was unaware they were actually better than the apple laptops ... what all aluminum laptops are better and how are they better ?
I know multiple people with Macbook contact phobia from the static charge the chassis builds up.
This is trivially solvable by using the 3-prong plug on the power adapter, btw. That grounds the laptop properly, no more charge build-up.
(unfortunately in the EU they only provide the 3-prong plug in the long-tail variant, which is kind of a bummer)
why would you want a laptop being made of metal?
it's a stylistic choice, not a logical one.
Because the body is the heat sink, it has no fan.
That alone is already very compelling for me (no noise, no fan to wear out). Then on top of that it has:
* Amazing battery life
* Great performance
* The best trackpad in the world
* Bright, crisp screen
The only downsides are the lack of upgradability and the annoying OS, but at least it's UNIX.
I just turn off trackpads, I'm not interested in that kind of input device, and any space dedicated to one is wasted to me. I use nibs exclusively (which essentially restricts me to Thinkpads).
My arms rest on the body, the last thing I want is for it to be a material that leeches heat out of my body or that is likely to react with my hands' sweat and oils.
> and the annoying OS
"...It's just a flesh wound..."
It feels great and it's recyclable.
Strawman. Because Apple designed it well. Metal’s not an issue. My legacy 2013 MacBook Air still looks and feels and opens like new.
I was looking at Thinkpad Auras today. There are unaligned jutting design edges all over the thing. From a design perspective, I’ll take the smooth oblong squashed egg.
Every PC laptop I’ve touched feels terrible to hold and carry. And they run Windows, and Linux only okay. Apple MacBooks are a long mile better than everything else and so I don’t care about upgraded memory — buy enough ram at purchase time and you don’t have to think about it again.
Memory upgrades aren’t priced super well, granted, but I could never buy HP Dell Lenovo ever again. They’re terrible. I’ve had all of them. Ironically the best device I’ve had from the other side was a Surface Laptop. But I don’t do Microsoft anymore. And I don’t want to carry squeaky squishy bendy plastic.
Most of all, I’m never getting on a customer support call with the outsourced vendors that do the support for those companies ever ever ever again. I’ll take a visit to an Apple store every day of the week.
I've had 4 Lenovos and out of the Asus, Dell, HP, Panasonic, and Sony laptops I've had, they always seem to have excellent Linux support.
My team is going through a lot of pain right now with new Lenovo Aura laptops. But I haven’t had a chance to Linux-ify them.
T series or x13 in particular.
Not sure about anything else, have ONLY used those.
Feel like these critiques are 10 years old.
I'd understand this about the 2016 MacBooks with the butterfly switch keyboards. I don't understand this in 2025.
the screen is very good, the trackpad is very good, the screen does not wobble or bend - it is sturdy. and it is quiet!
Rarely mentioned is the audio, the Mac's bass and overall sound is much better than any other laptop its size.
Right now I’m sitting in front of a hotel tv with speakers that are so crap that we are putting the sound through the MacBook to improve things.
If the Macbook has a bad keyboard (ignoring the Butterfly switches, which aren't on any of the M series machines, which are the ones people actually recommend and praise), then the vast majority of Windows machine have truly atrocious keyboards. I prefer the keyboard on my 2012 Macbook to the newer ones, but it's still better than the Windows machines I can test in local stores.
I prefer the aluminium to the plastic found on most Windows machines. The Framework is made from some aluminium alloy from what I know, and I see that as a good thing.
The soldered RAM sucks, but it's a trade-off I'm willing to make for a touchpad that actually works, a pretty good screen, and battery life that doesn't suck.
> "I never understood why people claim the Macbook is so good."
Apple's good enough for the average consumer, just like a 16-bit home computer back in the day. Everyone who looks for something bespoke/specialized (e. g. certified dual- or multi-OS support, ECC-RAM, right-to-repair, top-class flicker-free displays, size, etc.) looks elsewhere, of course.
It sounds like you have not tried an M series laptop in the last 3 years. Shrug.
Last 5.
>I am very impressed with how smooth and problem-free Asahi Linux is. It is incredibly responsive and feels even smoother than my Arch Linux desktop with a 16 core AMD Ryzen 7945HX and 64GB of RAM.
Hmmm, I still have an issue with the battery in sleep mode on the M1. It drains a lot of battery in sleep mode compared to macOS sleep mode.
At the bottom he also says:
> higher battery drainage during sleep, so I usually just shut it down entirely when not using it
> no hardware acceleration for video decoding
> some USB port quirks and external display quirks
I just don't understand why people go to such lengths to be sycophants for Apple.
Twice my battery was flat. Now I just do a complete shutdown, since Asahi boots in ~30s (M1 Pro).
Why does battery get drained in sleep at different rates? Is it a connected standby mode?
Afaiu, there are different levels of sleep and Linux doesn’t support all of them fully on Macs at the moment.
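If you're curious which suspend levels your kernel actually exposes, /sys/power/mem_sleep lists them, with the active one in brackets (s2idle is the shallow software-only idle, "deep" is traditional suspend-to-RAM). A trivial check, assuming the standard sysfs path - just a sketch:

    # Print the suspend modes the kernel offers; the bracketed one is currently selected.
    with open("/sys/power/mem_sleep") as f:
        print(f.read().strip())  # e.g. "s2idle [deep]" or just "[s2idle]"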
I've been an Asahi user since the early stages of the project when it used Arch. Today, I run Fedora Asahi Remix on a Mac Studio M1 Ultra with the Sway desktop, and it truly has been the perfect Linux workstation in every way.
https://github.com/jasoneckert/sway-dotfiles/blob/main/Asahi...
Hey, have you ever tried compiling the Linux kernel on it? In my experience it's often difficult to find compile benchmarks for Apple Silicon that aren't Xcode builds.
How's the battery life?
The battery life is terrible - as soon as I unplug my Mac Studio, it basically just shuts down.
I laughed.
I worked in an office with sketchy power.
The lights would flicker and the non-macs would turn off. The Macs just carried on. Mini, iMac, Mac Pro.
Yay for big caps I guess?
(2024).
For those curious about the Alkeria line-scan camera, he wrote a blog about 3d printing a lens mount etc. https://daniel.lawrence.lu/blog/2024-08-31-customizing-my-li...
Seems like a crazy hobby to me though! Photography is inconvenient enough without having to make your own mounts and use an SDK to do it! Then again, history is filled with inconvenient hobbies.
I would agree with the sentiment about the lack of good bright screens for lenovo's hacker laptops like the X1 carbon.
A 256GB SSD as the minimum spec is criminal in my opinion.
Why? I never use that much on my laptops. My desktop sure, but that’s what the cloud is for.
Why? Lots of people more or less use their computer as a glorified web browser, with some zoom calls and document editing thrown in for good measure. 256gb seems overkill. My girlfriend is somehow still rocking a 2011 MacBook Air. She mostly just uses it for internet banking and managing her finances. Why would she want more than 256gb?
A 1TB M.2 SSD cost 70 USD in summer 2025, and probably much less when bought in bulk as bare chips. It doesn't make sense to install anything less than 1TB in an expensive premium laptop. Or it should at least be upgradeable.
Apple's pricing is one of the reasons I am not going to buy their laptops. Expensive, and with no upgradeable or replaceable parts. And closed-source OS with telemetry.
> Lots of people more or less use their computer as a glorified web browser
For this purpose they can buy a $350 laptop with a larger screen.
Because the price tag is quite high to get as much storage as you would 15 years ago for about the same money.
I agree that many people use them as glorified internet machines but even then when they occasionally decide to back up some photos or edit a few videos the 256GB non-upgradable storage quickly becomes a limitation.
Price matters. 256GB is fine on a $500 web browsing laptop, but on a $1000+ one it's just a bad deal in 2025, even ignoring the fact that you cannot upgrade it later (it's soldered in place).
Possibly, but I don't see why those people would buy a new MacBook rather than a used $100 laptop (which would be better both for their finances and for the planet...)
Have you ever used windows on a $100 second hand laptop?
Imagine for a second that you don't know much about computers. You buy something crap like that and turn it on. Windows is of course already installed. Along with 18 antivirus programs and who knows what other junk. The computer will run dog slow. Even if you get rid of all the preinstalled programs, it'll run horribly slowly.
My mum has a computer from her work. It's pretty recent - worth way more than $100. It takes about 5-10 seconds for Zoom or Google Chrome to start. And about 15 seconds for Outlook to open. It's an utterly horrible experience.
If you can afford it, you'll have a way better experience on a macbook air from the last few years. In comparison, everything starts instantly. The experience is fantastic. Premium, even.
Personally I think it's criminal that cheap laptops run modern software so poorly. It's just laziness. There's no reason for the experience to be so horrible. But the world being what it is, there are plenty of reasons to spring for a $1000 MacBook Air over a $100 second hand Windows crapbook if you can afford it. Even if you don't do much with the computer.
> there are plenty of reasons to spring for a $1000 MacBook Air over a $100 second hand Windows crapbook if you can afford it
Plus you can pick up a used M1 MacBook Air for as little as $300 these days. Despite being 5 years old, it'll still smoke anything on the PC side much under a grand, in terms of responsiveness.
Battery life?
Go for a midrange Chromebook then. It's all my wife uses, cost about $250 and has better battery life than most laptops on the market.
I don’t know, I kind of like 10 hrs on battery with normal usage and screen fully lit on a 15” screen while not being bulky. Virtually no contenders in that space.
To think that they had the audacity to sell 8GB RAM too
Is it possibly because 256GB is the minimum spec of the MacBook Air M2?
I think they mean that in 2025, 256GB is unreasonably small. Which is true; Apple wants to up-charge hundreds of dollars just to get to the otherwise standard 1TB drive.
Realistically, it is reasonable to expect 2TB drives, based on normal progression https://blocksandfiles.com/2024/05/13/coughlin-associates-hd...
That could be - it totally wasn't clear to me whether they meant the hardware itself is unreasonably small, or that it's unreasonable for Asahi to call that the minimum you could run on.
Honestly, I suspect there are a whole lot of people that could be perfectly happy with 256GB storage on an Air. I mean, sure, I got 2TB for my MBP 5 years ago, but my father-in-law is likely never going to need even close to 256GB on his, which is basically being a Chromebook for him.
I just bought a 2TB drive over the holidays and it was $250, so it's not like they're an insignificant percentage of a <$1K laptop.
From a supply perspective, 256GB seems ridiculous because you can get way more capacity for not very much money, and because 256GB is now nowhere close to enough flash chips operating in parallel to reach what is now considered high performance.
But from a demand perspective, there are a lot of PC users for whom 256GB is plenty of capacity and performance. Most computers sold aren't gaming PCs or professional workstations; mainstream consumer storage requirements (aside from gaming) have been nearly stagnant for years due to the popularity of cloud computing and streaming video.
Did someone do a deep dive on why battery life is so awful on Linux? Or are inefficiencies in Asahi's drivers causing this?
Each controller and subcomponent on the motherboard needs a driver that correctly puts it into low power and sleep states to get battery savings.
Most of those components are proprietary and don't use the standard drivers available in the Linux kernel.
So someone needs to go and reverse engineer them, upstream the drivers, and pray that Apple doesn't change them in the next revision (which they did), or the whole process needs to start again.
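If you want to see which components' drivers are actually reaching their low-power states on a given machine, the kernel exposes that per device in sysfs. A rough sketch in Python (assuming a mainline-style /sys layout; nothing Asahi-specific):

    import os

    # Walk sysfs and report devices whose runtime PM state is not "suspended",
    # i.e. components that keep drawing power because their driver never idles them.
    for root, dirs, files in os.walk("/sys/devices"):
        if os.path.basename(root) != "power" or "runtime_status" not in files:
            continue
        try:
            status = open(os.path.join(root, "runtime_status")).read().strip()   # active / suspended / unsupported
            control = open(os.path.join(root, "control")).read().strip()         # auto / on
        except OSError:
            continue
        if status != "suspended":
            print(f"{os.path.dirname(root)}: status={status}, control={control}")

The more of these that sit in "active" at idle, the more those thousand little cuts add up.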
In other words: get an actually Linux supported laptop for Linux.
> In other words: get an actually Linux supported laptop for Linux.
40% battery for 4hrs of real work is better than pretty much any linux supported laptop I've ever used
One of my favorite machines was the MacBook Air 11 (2012). This was a pure Intel machine, except for a mediocre Broadcom wireless card. With a few udev rules, I squeezed out the same battery performance from Linux I got from OS X, down to a few minutes of advantage in favor of Linux. And all this despite Safari being a marvel of energy efficiency.
The problem with Linux performance on laptops boils down to i) no energy tweaks by default and ii) poor device drivers due to the lack of manufacturer cooperation. If you pick a machine with well supported hardware and you are diligent with some udev rules, which are quite trivial to write thanks to powertop suggestions, performance can be very good.
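For context, most of what powertop suggests boils down to writing "auto" into each device's power/control attribute; the udev-rule form is typically just ATTR{power/control}="auto" on the matching subsystem. A rough Python equivalent, purely as a sketch (needs root, and blindly doing this to USB input devices can make a mouse feel laggy, which is exactly why targeted udev rules are the nicer long-term answer):

    import glob

    # Enable runtime power management ("auto") for PCI and USB devices,
    # roughly what powertop's tunables / the usual udev rules end up doing.
    for path in glob.glob("/sys/bus/pci/devices/*/power/control") + \
                glob.glob("/sys/bus/usb/devices/*/power/control"):
        try:
            with open(path, "w") as f:
                f.write("auto")
        except OSError as e:
            print(f"skipped {path}: {e}")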
I am getting a bit more than 10 hours from a cheap ThinkPad E14 Gen7, with a 64 Wh battery and light coding use. That's less than a MacBook Air, where I would be getting around 13-14 hours, but it's not bad at all. The difference comes mainly from the cheap screen, which consumes more power, and from ARM's superior efficiency when idling.
But I prefer not to trade the convenience and openness of x86_64 plus NixOS for a bit more battery range. IMHO, the gap is not sufficiently wide to make a big difference in most usage scenarios.
The need to tweak that deeply just to get “baseline” performance really stings, though, particularly if you’re not already accustomed to having to do that kind of thing.
It’d be a gargantuan project, but there should probably be some kind of centralized, cross-distro repository for power configuration profiles that allows users to rate them with their hardware. Once a profile has been sufficiently user-verified and is well rated, distro installers could then automatically fetch and install the profile as a post-install step, making for a much more seamless and less fiddly experience for users.
Not trying to discredit your idea, but I feel like you're misrepresenting the amount of optimization Apple does by calling it baseline.
It's generally the most optimized system, down to the fact that Apple controls everything about its platform.
If that's considered baseline, then nothing but full vertical integration can compete
For some laptops, this applies in comparison to Windows, too though (see elsewhere in thread for examples).
The advantage of Apple is that they deal with a tiny amount of hardware. Vertical integration enables aggressive optimizations.
I agree that in case of Linux, a udev rule generator would be a fantastic step ahead in terms of usability.
> 40% battery for 4hrs of real work is better than pretty much any linux supported laptop I've ever used
Not sure what "real work" is for you, but I regularly get more than 12 hours of battery life on an old Chromebook running Linux and the usual IDEs/dev tooling (in a Crostini VM). All the drivers just work, and sleep has no detectable battery drain. It's not a workstation by any means, but dual-core Intels are great for Python/Go/TypeScript.
Out of curiosity, does Google contribute the drivers for Chromebook hardware to Linux upstream, or do they keep them for themselves? Could it be that they just choose hardware that works very well out of the box with Linux?
I have no idea if there's upstreaming, but the Chromium OS repo is open source so you could check.
I don't know if that would help the wider Linux laptop community, because Chromebook OEMs can only select from a small list of CPU & chipset hardware combinations blessed by Google
What's the bar here? My Thinkpad X270 gets about 16 hours under Ubuntu with swaywm.
If we really want to get pedantic, its internal battery means the external pack is hot-swappable, so I can actually get several days on a "single charge." Good machine for camping trips.
[flagged]
Please don't cross into personal attack, name-calling, or cross-examination. This is in the site guidelines: https://news.ycombinator.com/newsguidelines.html.
I see how the GP comment could be provocative, but on this site we want responses that dampen provocation, not amplify it.
>In other words: get an actually Linux supported laptop for Linux.
For a lot of people the point is to extend the life of their already-purchased hardware.
Linux might work with your hardware, but it might not work well.
If your vendor is hostile like Apple, it will be hard to make it keep on working.
Is Apple really hostile, though?
For the boot process, Apple went out of its way to support booting alternative OSs without compromising on security.
And mind you, most other laptops are no friends either. E.g. the often beloved ThinkPads have a bunch of throttling issues on non-Windows OSs.
No one is doing that with ARM MacBooks.
Yet. Plenty of people have with Intel ones - I'm one of them. My first experience with Linux was on a 2016 MacBook Pro. And inevitably people will do the same with the silicon Macs, likely using Asahi it seems.
Why are some of y'all so hostile to this idea?
It's not inevitable. That's not what that word means.
Intel Macs supported Linux because they used Intel's Linux drivers and supported bog-standard UEFI. There are no preexisting drivers or DeviceTree files published by Apple for Linux. There is no UEFI implementation, just a proprietary bootloader that can be updated post-hoc to deny booting into third-party OSes.
> Why are some of y'all so hostile to this idea?
I would love for Linux to support as many ARM devices as possible. Unfortunately, it requires continuous effort from the OEM to be viable. I've bought Qualcomm, Rockchip and Broadcom boards before, none of them have been supported for half as long as my x86 machines are. Nevermind how fast ARM architectures become obsolete.
It feels like Apple is really the only hostile party here, and they coincidentally decide whether or not you get to use third-party OSes.
It is inevitable. I guarantee you there will be people who run Linux on their silicon Macs. I don’t know how you could possibly hold a stance that no one ever will.
Apple is very hostile to it. It won’t stop everyone though. It’ll continue to be niche but it’s happening.
It's not inevitable. It's fragile. Go boot up your old iPad; that should be well-studied, right? We ought to know how to boot into Linux on an ARM machine that old, it's only fair.
Except, you can't. The bootloader is the same iBoot process that your Apple Silicon machine uses, with mitigations to prevent unsigned OSes or persistent coldboot. All the Cydia exploits in the world won't put Linux back on the menu for iPhone or iPad users. And the same thing could happen to your Mac with an OTA update.
It is entirely possible for Apple to lock down the devices further. There's no guarantee they won't.
> with mitigations to prevent unsigned OSes
There is literally an apple-developed way to boot securely into alternative OSs. How would asahi work otherwise?
Also, if only apple is hostile, where is my Xbox/ps/switch/any random Android tablet/million other device's running Linux?
> There is literally an apple-developed way to boot securely into alternative OSs
This is not a good thing! You do not want a proprietary bootloader as your only way to launch Linux, it's not a safe or permanent solution. Apple Silicon could have implemented UEFI like the previous Macs did, but Apple chose to lock users into a bootloader they controlled. This is markedly different from most ARM device bootloaders which aren't changed by an OTA update in another OS partition.
> if only apple is hostile
I did not say that at any point in my comment. Forgeties' original claim was that Apple Silicon would eventually have support comparable to Intel MacBooks on Linux. I am telling them point-blank that it is impossible, because Apple and Intel have fundamentally different attitudes towards Linux.
> where is my Xbox/ps/switch/any random Android tablet/million other device's running Linux?
Your Switch and Android tablet are already running the Linux kernel. The past 2 generations of Xbox and PS chipsets have upstream support from AMD in the Linux kernel, so you really only need a working bootloader to get everything working.
Ironically, this does mean that the Nintendo Switch has more comprehensive Linux support than Apple Silicon does.
Sure, the kernel. But you surely know that Android abstracts away the drivers, so without the proprietary drivers you are back to square one - it's not "GNU/Linux".
But you didn't say GNU/Linux, which surely you know is an arbitrarily high bar for any hardware to attain.
My point still stands, Apple does not trust their customers.
Sigh.
Apple cannot lock down the Mac. You can't have a development machine that is incapable of running arbitrary code. Back when they still did WWDC live they said that software development was the biggest professional bloc of Mac users. I'm certain that these days development is the biggest driver of the expensive Macs. No one has ever made a decent argument as to why Apple would lock down the Mac that would also explain why they haven't done it yet.
Passivity isn’t hostility. There isn’t any evidence that Apple is considering locking down the Mac. They could have easily done that with the transition to their own silicon but they didn’t despite the endless conspiracy theories.
Apple can lock down the Mac. You might not think it is likely, but without UEFI there is no recourse if Apple decides to update iBoot. How do you launch Asahi if Apple quits reading the EFI from the secure partition?
> They could have easily done that with the transition to their own silicon
They already did, that's what my last comment just outlined. Macs do not ship with UEFI anymore, you are wholly at the mercy of a proprietary bootloader that can be changed at any time.
We have been talking about laptops from the very beginning. I don’t know why you keep talking about iPads.
Because the Apple laptops use exactly the same architecture and bootloader as iPads - if you think they're separate you don't know enough to be part of this debate.
They made significant changes to the bootloader with the explicit goal of allowing boot of third-party operating systems.
Unless you can find a way to implement U-Boot on Apple Silicon, they can make more significant changes with no easy opt-out.
> if you think they're separate you don't know enough to be part of this debate.
If this is the direction the conversation is headed in then I’m done. Have a good weekend.
That's an admirable goal, but, depending on the hardware, it can run into that pesky thing called reality.
It's getting very tiresome to hear complaints about things that don't work on Linux, only to find that they're trying to run it on hardware that's poorly supported, and that's something they could have figured out by doing a little research beforehand.
Sometimes old hardware just isn't going to be well-supported by any OS. (Though, of course, with Linux, older hardware is more likely to be supported than bleeding-edge kit.)
> It's getting very tiresome to hear complaints
This is very true. I've been asked by lots of people "how do I start with Linux" and, despite being a 99.9% Linux user for everything every day, my advice was always:
1. Use VirtualBox. Seriously, it won't look cool, but it will 100% work after maybe 5 mins mucking around with installing guest additions. Also snapshots. Also no messing with WiFi drivers or graphics card drivers or such.
2. Get a used beaten down old Thinkpad that people on Reddit confirm to be working with Linux without any drivers. Then play there. If it breaks, reinstall.
3. If the above didn't make you yet disinterested, THEN dual boot.
Also, if you don't care about GUI, then use the best blessing Microsoft ever created - WSL, and look no further.
I've never gotten along too well with virtualization, but would second the ThinkPad idea, or something similar. Old/cheap machine for tinkering is a good way to ease in, and I think bare metal feels more friendly.
I'd probably recommend against dual booting, but I understand it's controversial. I like to equate it to having two computers, but having to fully power one off to do anything* on the other one. Torrents stop, music collection may be inaccessible depending on how you stored it, familiar programs may not be around anymore. I dual booted for a few years in the past and I found it miserable. People who expected me to reboot to play a game with them didn't seem to understand how big of an ask that really was. Eventually things boiled over and I took the Windows HDD out of that PC entirely. Much more peaceful. (Proton solves that particular issue these days also)
That being said, I've had at least two friends who had a dual boot due to my influence (pushing GNU/Linux) who ended up with some sort of broken Windows install later on and were happy to already have Ubuntu as an emergency backup to keep the machine usable.
*Too old might be a problem these days with major distros not having 32bit ISOs anymore
I went 100% bazzite back in April/May, no windows, and I couldn’t be happier. The pc I built is basically 90% gaming/movies/hanging with friends, 10% browser tasks. Very easy to live this life if you don’t have particular professional needs IMO. When I was doing more freelance editing this really would not have been an option as resolve studio does not work well on Linux.
WSL supports GUI apps now. They open up just like any other GUI app on Windows.
I've tried this once for IntelliJ to work around slow WSL access for Git repos. Was greeted by missing fonts and broken scaling on the intro screen. Oops. But probably I was just unlucky, it might work well for most.
1. Linux isn't a panacea for depreciated hardware, and it never will be.
2. If your priority is system lifespan, you are already using OEM macOS.
1. I dunno about a panacea, but it's pretty great for old hardware. My 2011 desktop still runs Alpine Linux just fine.
2. By all means start with macOS, but eventually Apple will stop supporting your machine. And y'know what will still work and get updates then? Linux.
> but it's pretty great for old hardware
Which old hardware? You're circling around to the grandparent's point again; Linux support is hardware dependent.
> And y'know what will still work and get updates then?
No, I don't. Depreciated iPads lay dead in piles, and they don't run Linux for shit. You want me to believe the M4 will graduate to the big leagues?
Never said it was “panacea for depreciated hardware.” I’m saying it’s a common use case.
In every thread about Linux, someone inevitably says "it gave new life to my [older computer model]." We've all seen it countless times.
And there it is: https://news.ycombinator.com/item?id=46387364
It's a common use-case for x86 machines that implement UEFI. Taking the iPhone and iPad into account, it is a nonexistent use-case for mobile ARM chipset owners.
I know you may have a particular axe to grind here, but Android devices are not a whole lot more likely to let you boot a vanilla linux distro. Apart from a handful of explicitly linux-compatible smartphones, the boot loaders tend to be pretty locked down, and the drivers all propietrary too
>taking the iPhone and iPad into account
This post is about the MacBook Air M2. The discussion has been about silicon MacBooks - laptops - from the start.
Apple does tons of optimizations for every component to improve battery life. Asahi Linux, which is reverse engineered, doesn't have the resources to figure out each of those tricks, especially for undocumented proprietary hardware, so it's a "death by a thousand cuts" as each of the various components is always drawing a couple of milliwatts more than on macOS.
It absolutely is not awful. You are doing something wrong then. It's not as good as on macOS of course but it's still great. I get 8-10 hours.
Eh it's pretty awful. I get 8 hours, yes, but in Linux, those 8 hours are ticking whether my laptop is sleeping in my bag or on my desk with the lid closed or I'm actively using it. 8 hours of active use is pretty good, but 8 hours in sleep is absolutely dreadful.
Exactly. This myth keeps being perpetuated, for some reason.
I'm typing this from a ThinkPad X1 Carbon Gen 13 running Void Linux, and UPower is reporting 99% battery with ~15h left. I do have TLP installed and running, which is supposed to help. Realistically, I won't get around 15h with my usage patterns, but I do get around 10-12 hours. It's a new laptop with a fresh battery, so that plays a big role as well.
This might not be as good as the battery life on a Macbook, but it's pretty acceptable to me. The upcoming Intel chips also promise to be more power efficient, which should help even more.
For optimal battery life you need to tweak the whole OS stack for the hardware. You need to make sure all the peripherals are set up right to go into the right idle states without causing user-visible latency on wake-up. (Note that often just one peripheral being out of tune here can mess up the whole system's power performance. Also the correct settings here depend on your software stack). You need to make sure that cpufreq and cpuidle governors work nicely with the particular foibles of your platform's CPUs. Ditto for the task scheduler. Then, ditto for a bunch of random userspace code (audio + rendering pipeline for example). The list goes on and on. This work gets done in Android and ChromeOS.
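If you want to see which of those knobs are actually in play on a given box, the governors are all visible in sysfs. A quick look, assuming mainline paths - just a sketch:

    import glob
    from pathlib import Path

    def read(path):
        try:
            return Path(path).read_text().strip()
        except OSError:
            return "n/a"

    # Which cpufreq driver/governor each policy uses, and which cpuidle
    # driver/governor picks the idle states - the pieces that need per-platform tuning.
    for policy in sorted(glob.glob("/sys/devices/system/cpu/cpufreq/policy*")):
        print(policy, read(policy + "/scaling_driver"), read(policy + "/scaling_governor"))
    print("cpuidle driver:  ", read("/sys/devices/system/cpu/cpuidle/current_driver"))
    print("cpuidle governor:", read("/sys/devices/system/cpu/cpuidle/current_governor"))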
Asahi doesn't yet support all the CPU power states etc. This is a known limitation, not sure how easy it is to reverse engineer though.
This is the case with most (all?) laptops running Linux regardless of hardware unfortunately.
This doesn't match my experience. My previous three laptops (two AMD Lenovo Thinkpads, one Intel Sony VAIO) had essentially the same battery life running Linux as running Windows.
Does that mean that Windows's battery life suck equally on these devices?
You're lucky, my thinkpad x13 gen 2 AMD gets 5 hours on modern Fedora vs. 9 or 10 on Windows.
I also have an X13 Gen2 AMD. My idle power consumption is 2.5W to 4W depending on brightness. This ends up at 12h-15h (the machine/battery is 2 years old, I think).
I think you can improve your power settings.
I have an AMD ThinkPad and get maybe 1/4 the battery life on Linux as I do when I boot into Windows. Did you have to do any tweaking to achieve that?
I typically install and enable tlp [1], but that's it. Some distros/DEs might have it out of the box, but on Arch I had to do it myself.
[1] https://wiki.archlinux.org/title/TLP
TLP, as mentioned. Also powertop for more in-depth view of what's happening on power consumption front.
I think MacOS was implied...
Have you ever put MacOS on a PC laptop? Terrible hardware support and the worst battery life of any OS.
I used to hackintosh every laptop I could get my hands on that could do it, and always saw better battery life on OS X vs. Windows.
Hey! I actually wrote a thing to make the Swaybar a little more "complete" (e.g. battery status, currently selected program, clock, inspirational quote from ChatGPT, etc): https://git.sr.ht/~tombert/swaybar3
Not going to claim it will change the world or anything, but this runs perpetually with Sway and according to System Monitor it hovers at a little less than a megabyte of RAM. You can set how often you want things to update, and add as many sections as you'd like, and it's easy to create extra modules if you are so inclined (though not as easy as the Clojure version since I haven't found an implementation of multimethods for Rust that I like as much).
Putting swaybar at the top behind the notch is a great idea!
A new Wayland protocol is in the works that should support screen cutout information out of the box: https://phosh.mobi/posts/xdg-cutouts/ Hopefully this will be extended to include color information whenever applicable, so that "hiding" the screen cutout (by coloring the surrounding area deep black) can also be a standard feature and maybe even be active by default.
Wayland modularity is the gift that keeps on giving.
You can't be serious. Wayland is the opposite of modular, and the concept of an extensible protocol only creates fragmentation.
Every compositor needs to implement the giant core spec, or, realistically, rely on a shared library to implement it for them. Then every compositor can propose and implement arbitrary protocols of their own, which should also be supported by all client applications.
It's insanity. This thing is nearly two decades old, and I still have basic clipboard issues[1]. This esoteric cutouts feature has no chance of seeing stable real-world use for at least a decade.
[1]: https://bugs.kde.org/show_bug.cgi?id=466041
Shh... you're not supposed to mention these things lest you be downvoted to death.
I also have tremendous issues with Plasma. Things such as graphics glitching in the alt+tab task switcher or Firefox choking the whole system when opening a single 4k PNG image. This is pre-alpha software... So back to X11 it is. Try again in another decade or two.
YMMV and all, but my experience is that Wayland smoothness varies considerably depending on hardware. On modernish Intel and AMD iGPUs for example I’ve not had much trouble with Wayland whereas my tower with an Nvidia 3000 series card was considerably more troublesome with it.
As a user...why would I care?
If my Ferrari has an issue with the brakes and I go to my dealer I don't care if the brakes were by Brembo.
Blaming the vendor and their drivers is just trying to shift the blame.
Generally true, though this particular case is due to a single company deciding to not play ball and generally act in a manner that's hostile to the FOSS world for self-serving reasons (Nvidia).
I don't think it's even that. These bugs seem like bog-standard bugs related to correctly sharing graphics resources between processes and accessing them with proper mutual exclusion. Blaming NV is likely just a convenient excuse.
> my tower with an Nvidia 3000 series card was considerably more troublesome with it.
I think you're describing a driver error from before Nvidia really supported Wayland. My 3070 exhibited similar behavior but was fixed with the 555-series drivers.
The Vulkan drivers are still so/so in terms of performance, but the smoothness is now on-par with my Macbook and Intel GNOME machine.
My 3080 still has the occasional hiccup, but from what I've read it's from vsync colliding with Nvidia's gsync?
The thing is that I'm not experiencing this clipboard issue on Plasma, but on a fresh installation of Void Linux with niri. There are reports of this issue all over[1][2][3], so it's clearly not an isolated problem. The frustrating thing is that I wouldn't even know which project to report it to. What a clusterfuck.
I can't go back to X11 since the community is deliberately killing it. And relying on a fork maintained by a single person is insane to me.
[1]: https://old.reddit.com/r/hyprland/comments/1d4s9bw/ctrlc_ctr...
[2]: https://old.reddit.com/r/tuxedocomputers/comments/1i9v0n7/co...
[3]: https://old.reddit.com/r/kde/comments/1jl6zv7/why_does_copyp...
> since the community is deliberately killing it
No one is killing it. No one willing to work on it is a very very different thing, and it's very bad faith and needlessly emotional to attribute malice to a lack of support.
What's in very bad faith is twisting the words of the people who work on these projects[1], and blaming me for echoing them.
It's very clear from their actions[2][3] that they have been actively working to "kill" X11.
There are still people willing to work on it, hence the XLibre fork. The fact that most mainstream distros refuse to carry it is another sign that X11 is in fact being actively "killed".
[1]: https://mastodon.social/@alatiera/114661446785833161
[2]: https://blogs.gnome.org/alatiera/2025/06/08/the-x11-session-...
[3]: https://www.phoronix.com/news/RHEL10-Removing-X.Org
Re X11 maintenance... I think it's mostly "done" and doesn't really need a lot of work. So not sure I see a problem there.
Far from it. The recent XLibre release[1] has a long list of bugfixes and new features.
Besides, isn't the main complaint from the Wayland folks that X11 is insecure and broken? That means there's still a lot of work to be done. They just refuse to do it.
To be fair, X11 has worked great for me for the past ~20 years, but there are obvious improvements that can be made.
[1]: https://github.com/X11Libre/xserver/releases/tag/xlibre-xser...
> Every compositor needs to implement the giant core spec
Is it giant or modular now?
How can Wayland be the opposite of modular and too extensible at the same time?
Because one property doesn't guarantee the other. A modular system may imply that it can be extended. An extensible system is not necessarily modular.
Wayland, the protocol, may be extensible, but the implementations of it are monolithic. E.g. I can't use the xdg-shell implementation from KWin on Mutter, and so on. I'm stuck with whatever my compositor and applications support. This is the opposite of modularity.
So all this protocol extensibility creates in practice is fragmentation. When a compositor proposes a new protocol, it's only implemented by itself. Implementations by other compositors can take years, and implementations by client applications decades. This is why it's taken 18 years to get close to anything we can refer to as "stable".
> E.g. I can't use the xdg-shell implementation from KWin on Mutter, and so on.
Why not? It's open-source software. Depending on your architecture you may be able to reuse parts of it.
But as a more flexible choice, there is wlroots.
> and implementations by client applications decades.
Toolkits implement these stuff, so most of the time "support by client application" is a gtk/qt version bump away.
> This is why it's taken 18 years to get close to anything we can refer to as "stable".
Is it really fair to compare the first 10 years of a couple of hobby developers with the current "wide-spread" state of the platform? If it were like today for 18 years and failed to improve, sure, something must be truly problematic. But there were absolutely different phases and uptake of the project, so it moved at widely different speeds.
> Why not? It's open-source software. Depending on your architecture you may be able to reuse parts of it.
"The system is not modular, but you can make it so."
What a ridiculous statement.
> But as a more flexible choice, there is wlroots.
Great! How do I use wlroots as a user?
> Toolkits implement these stuff, so most of the time "support by client application" is a gtk/qt version bump away.
Ah, right. Is this why Xwayland exists, because it's so easy to do? So we can tell users that all their applications will continue to work when they switch to Wayland?
> Is it really fair to compare the first 10 years of a couple of hobby developers with the current "wide-spread" state of the platform?
It's not fair, you're right. I'll wait another decade before I voice my concerns again.
You can see the same problem in the XMPP world, with a lot of the extensions implemented only by a few applications. But at least most XMPP extensions are designed to be backwards-compatible with clients that don't support them.
Isn't this something KDE-specific, caused by using a third-party program to manage the clipboard? On GNOME, there is no Klipper and copy-paste works.
Omg I thought I was going senile... copy & paste not working... I definitely pressed ctrl-c didn't I?
Bloody Wayland.
What do you mean, it's not working? It has been working forever.
You know what OS doesn't handle the notch? OSX. It happily throws the system tray icons right back there, with only an obscure workaround to get them back. Software quality at Apple these days…
Nitpick: it’s called macOS since 2016.
Is there a difference from 2024? Is the M2 still a good choice for Linux? I don’t mind older generations, I’m used to be a bit behind in terms of hardware as a tradeoff for good Linux support.
I used to enjoy the X line of ThinkPads but nowadays I don’t see a point going for them anymore, as the things I appreciated about them are slowly being phased out.
There is no support really for Linux on the M3+, nor should anyone expect the situation to change now that the main devs have moved on.
If you would be happy with a M1/M2 laptop knowing full well that it is a dead end and you will never have another Mac laptop with Linux support (the default assumption at this point), then yes it is a great machine.
> main devs have moved on
How confident are you in this statement? I have no particular knowledge of Asahi. But I do know this narrative emerged about Rust-for-Linux after a couple of high-profile individuals quit.
In that case it was plainly bogus but this was only obvious if you were somewhat adjacent to the relevant community. So now I'm curious if it could be the same thing.
(Hopefully by now it's clear to everyone that R4L is a healthy project, since the official announcement that Rust is no longer "experimental" in the kernel tree).
I know Asahi is a much smaller project than R4L so it's naturally at higher risk of losing momentum.
I would really love Asahi to succeed. I recently bought a Framework and, while I am pretty happy with it in isolation... when I use my partner's M4 Macbook Air I just think... damn. The quality of this thing is head and shoulders above the rest of the field. And it doesn't even cost more than the competition. If you could run Linux on it, it would be completely insane to use anything else.
Someone should create a minimal, nearly-headless macOS distribution (similar to the old hackintosh distros) that bootstraps just enough to manage the machine's hardware, with no UI, and fires up the Apple virtualization framework and a Linux VM, which would own the whole display.
It's similarly bogus here. Early Asahi development tried to upstream as much as possible but ultimately still maintained a gigantic pile of downstream patches, which wasn't a sustainable model.
Most of current development is focused on reducing that pile to zero to get things into a tractable state again. So things continue to be active, but the progress has become much less visible.
M2 to M3 was a complete architectural change that will require a lot of reverse engineering. As far as I know no one is working on this. The M1/M2 work was a labor of love of largely one dev that has since moved on.
The project is still active and working to upstream the work of these devs. But as far as I know, no NEW reverse engineering is being done. Ergo, it’s a dead end.
Would be happy to be proven wrong.
The idea that a group of people would spend so much of their time trying to get Linux to work on Apple hardware through reverse engineering always seemed absolutely crazy to me. I would never consider buying Apple hardware precisely because it doesn't support Linux, and the work they put in achieves nothing because the risk will always remain that Apple will lock the hardware down further. Never mind the fact that they will likely never fully reverse engineer all the components.
It just seems like a completely pointless endeavor... perhaps some people buy into it? Why would anyone buy overpriced hardware with partial support that may one day be gone? The enhanced battery life doesn't really hold much appeal to me, and the ARM architecture, if anything, is just another signal to stay away.
The only thing that makes sense to me is that they wanted the achievement on their resume, and in that respect, given recent developments, they succeeded?
The hardware isn't overpriced, it's best in class. It's just that that class isn't what you're looking for, and as a Linux user it's not for you, which is valid! But for what it is, the hardware has one of the absolute best price-to-performance ratios on the market right now, and I'm tired of people pretending it doesn't. You can get a brand new M4 MacBook Air for under $800 right now, and that's simply one of the best deals around. For an M2 for Asahi Linux? Second hand, the prices are even better.
You overlooked the UTM app on the App Store (it's available as open source too), which wraps Apple Silicon virtualization excellently, or you can use QEMU (which I don't).
I used to use Asahi, but the sleep modes power drain was tedious.
With UTM, I install the latest Fedora ISO (declaring it a "Linux"), which exposes the option to skip QEMU and use native Apple Silicon virtualization.
It's fantastic. I mention this only because it's been super useful, way better than Asahi, with minimal effort.
It's like Hackintosh all over again but with Apple hardware rather than their cursed software.
Maybe they just needed a hobby. I for one think it's a pretty cool one.
I had a pretty similar setup with Sway too; idle power usage was pretty bad, but the lack of external display support became a deal breaker at some point.
What is the prospect for newer M support, e.g. M3, M4? I am hesitant to adopt something that doesn't work with current and future models.
Asahi is all reverse engineering. It's nothing short of a miracle what has already been accomplished, despite, not because of, Apple.
That said some of the prominent developers have left the project. As long as Apple keeps hoarding their designs it’s going to be a struggle, even more so now.
If you care about FOSS operating systems or freedom over your own hardware there isn’t a reason to choose Apple.
To be clear, the work the asahi folks are doing is incredible. I’m ashamed to say sometimes their documentation is better than the internal stuff.
I've heard it's mostly because there wasn't an M3 Mac mini, which would have been a much easier target for CI since it isn't a portable. Also, there have been a ton of hardware changes internally between M2 and M3. M4 is a similar leap. More coprocessors, more security features, etc.
For example, PPL was replaced by SPTM and all the exclave magic.
https://randomaugustine.medium.com/on-apple-exclaves-d683a2c...
As always, opinions are my own
This is what ruffles my jimmies about this whole thing:
> I’m ashamed to say sometimes their documentation is better than the internal stuff.
The reverse engineering is a monumental effort, this Sisyphean task of trying to keep up with never-ending changes to the hardware. Meanwhile, the documentation is just sitting there in Cupertino. An enormous waste of time and effort from some of the most skilled people in the industry. Well, maybe not so much anymore since a bunch of them left.
I really hope this ends up biting Apple in the ass instead of protecting whatever market share they are guarding here.
I strongly support a project's stance that you shouldn't ask when it will be done. But the time between the M1 launch and a good experience was less than the time that has passed since the M3; I would love to know what is involved.
Have they though? Hector just added support for the power button, I wonder if he is officially back?
https://lore.kernel.org/asahi/20251215-macsmc-subdevs-v6-4-0...
That's an email from James Calligeros. All this patch says is that the author is Hector Martin (and Sven Peter). The code could have been written a long time ago.
The new project leadership team has prioritized upstreaming the existing work over reverse engineering on newer systems.
> Our priority is kernel upstreaming. Our downstream Linux tree contains over 1000 patches required for Apple Silicon that are not yet in upstream Linux. The upstream kernel moves fast, requiring us to constantly rebase our changes on top of upstream while battling merge conflicts and regressions. Janne, Neal, and marcan have rebased our tree for years, but it is laborious with so many patches. Before adding more, we need to reduce our patch stack to remain sustainable long-term.
https://asahilinux.org/2025/02/passing-the-torch/
For instance, in this month's progress report:
> Last time, we announced that the core SMC driver had finally been merged upstream after three long years. Following that success, we have started the process of merging the SMC’s subdevice drivers which integrate all of the SMC’s functionality into the various kernel subsystems. The hwmon driver has already been accepted for 6.19, meaning that the myriad voltage, current, temperature and power sensors controlled by the SMC will be readable using the standard hwmon interfaces. The SMC is also responsible for reading and setting the RTC, and the driver for this function has also been merged for 6.19! The only SMC subdevices left to merge is the driver for the power button and lid switch, which is still on the mailing list, and the battery/power supply management driver, which currently needs some tweaking to deal with changes in the SMC firmware in macOS 26.
> Also finally making it upstream are the changes required to support USB3 via the USB-C ports. This too has been a long process, with our approach needing to change significantly from what we had originally developed downstream
https://asahilinux.org/2025/12/progress-report-6-18/
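"Standard hwmon interfaces" concretely means those SMC sensors show up under /sys/class/hwmon like any other sensor chip, so generic tools (or a few lines of script) can read them. A minimal sketch, assuming only the mainline hwmon sysfs ABI, nothing Asahi-specific:

    import glob
    from pathlib import Path

    # List hwmon devices and their temperature sensors
    # (hwmon reports temperatures in millidegrees Celsius).
    for hwmon in sorted(glob.glob("/sys/class/hwmon/hwmon*")):
        name = Path(hwmon, "name").read_text().strip()
        for temp in sorted(glob.glob(hwmon + "/temp*_input")):
            try:
                value = int(Path(temp).read_text()) / 1000
            except OSError:
                continue
            print(f"{name} {Path(temp).name}: {value:.1f} C")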
This is a very straightforward problem with a relatively simple solution:
Stop buying Apple laptops to run Linux.
Hard disagree: try the UTM app on the App Store (or build it from open source) and you get Apple Silicon native virtualization and super simple installation of AArch64 Linuxes from an ISO.
I've been doing this for maybe a year, after frustration with power draw and sleep modes (and dual boot) with Asahi.
It's been great... and Apple Silicon is still super efficient, which is why I said hard disagree.
The project is effectively dead
Given the speed of the progress Apple has made on their hardware (from M1 to M5), I think the project was doomed from the very beginning. Reverse engineering per se is a huge talent drain that wastes a tremendous amount of man-hours on a closed problem. Also, the strong SW-HW integration of the Mac is sophisticated and fragile, and difficult to analyze and replicate. Nailing all those details is not only time consuming but also limited in scope, and it never yields anything beyond the status quo.
I'm quite glad those talented folks finally escaped from the pit of reverse engineering. It may be fun and interesting, but its future was already capped by Apple. I hope they find another passion, hopefully something more original and progressive. Stop chasing and push forward.
What? Why?
Very little progress has been made this year after high-profile departures (Hector Martin, the project lead; Asahi Lina and Alyssa Rosenzweig, the GPU gurus). Alyssa's departure isn't reflected on Asahi's website yet, but it is in her blog. I believe she also left Valve, which I think was sponsoring some aspects of the Asahi project. So when people say "Asahi hasn't seen any setbacks," be sure to ask them who has stepped in to make up for these losses in both talent and sponsorship.
https://rosenzweig.io/blog/asahi-gpu-part-n.html
Has Hector left though?
https://lore.kernel.org/asahi/20251215-macsmc-subdevs-v6-4-0...
Marcan (Hector Martin) resigned from Asahi Linux early this year [0].
Asahi Lina, who also did tons of work on the Asahi Linux GPU development, also quit as she doesn't feel safe doing Linux GPU work anymore [1].
[0] https://marcan.st/2025/02/resigning-as-asahi-linux-project-l...
[1] https://asahilina.net/luna-abuse/
GP's LKML link is very recent unlike your two links, implying something could've changed.
I have no insight into the Asahi project, but the LKML link goes to an email from James Calligeros containing code written by Hector Martin and Sven Peter. The code may have been written a long time ago.
marcan and Asahi Lina are the same person
Someone posting Hector's code or a quote does not mean he didn't leave. I'm really not sure how that could leave that impression.
Because key developers have left the project, and developers who are capable of such work are few and far between.
>are few and far between
They are more common than you would think. There just aren't many willing to work for a shoestring salary.
> There just aren't many willing to work for a shoestring salary.
You explained it well by yourself.
It's really hard to do and nobody is paying for it?
Without official support, the Asahi team needs to RE a lot of stuff. I'd expect it to lag behind by a couple of generations at least.
I blame Apple for pushing out new models every year. I don't get why they do that. An M1 is perfectly fine after a few years, but Apple treats it like an iPhone. I think one new model every 2-3 years is good enough.
The M1 is indeed quite adequate for most, but each generation has brought substantial boosts in single-threaded and multi-threaded performance, and, with the M5 generation in particular, in GPU-bound tasks. These advancements are required to keep pace with the industry and in a few aspects stay ahead of competitors, plus there are high-end users whose workloads greatly benefit from these performance improvements.
I agree. But Apple doesn’t sell new M1 chip laptops anymore AFAIK. There are some refurbished ones but most likely I need to go into a random store to find one. I only saw M4 and M5 laptops online.
That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger. Sure it is probably better for Apple to move forward quickly though.
In the US, Walmart is still selling the M1 MacBook Air new, for $599 (and has been discounted to $549 or better at times, such as Black Friday).
In general, I don't think it's reasonable to worry that Apple's products aren't thoroughly achieving economies of scale. The less expensive consumer-oriented products are extremely popular, various components are shared across product lines (eg. the same chip being used in Macs and iPads) and across multiple generations (except for the SoC itself, obviously), and Apple rather famously has a well-run supply chain.
From a strategic perspective, it seems likely that Apple's long history of annual iteration on their processors in the iPhone and their now well-established pattern of updating the Mac chips less often but still frequently is part of how Apple's chips have been so successful. Annual(ish) chip updates with small incremental improvements compounds over the years. Compare Apple's past decade of chip progress against Intel's troubled past decade of infrequent technology updates (when you look past the incrementing of the branding), uneven improvements and some outright regressions in important performance metrics.
> That’s why I don’t like it as a consumer. If they keep producing M1 and M2 I’d assume we can get better prices because the total quantity would be much larger.
Why would this be true? An M5 MacBook Air today costs the same as an M1 MacBook Air cost in 2020 or whenever they released it, and is substantially more performant. Your dollar per performance is already better.
If they kept selling the same old stuff, then you spread production across multiple different nodes and the pricing would be inherently worse.
If you want the latest and greatest you can get it. If an M1 is fine you can get a great deal on one and they’re still great machines and supported by Apple.
>I don’t get why it does that.
I've got a few ideas
Too bad they achieved this level of compatibility only on hardware you can no longer easily get.
Anyone know what the story is for running LLMs on apple hardware but using asahi?
AFAIK MPS cannot be used on Asahi, so it has to be done using Vulkan which will definitely be much slower.
Among other things, I quite like the blog, especially the travel photos. There's some 'old Internet / old blogosphere' vibe to it.
Not much to add to the topic of having a MacBook Air M2 with Linux. Glad it works well; I'm eyeing an M1 for myself. Yet my Retina (2014 model) works perfectly with Arch Linux (including sleep), so I'm patiently waiting for it to break. And at the same time I hope it works for another decade, it's that good. Battery life is still quite good: between 3 and 4 hours with the battery at 50% of capacity. So I expect a new battery could give me up to 8 hours, which is pretty impressive to me. In reality, I rarely need a session of more than an hour or two. The only thing I miss is USB-C charging, as that way I could charge with anything when I don't have my charger on me. Again, in reality, that's a pretty rare scenario.
The author mentions he paid $750 for a MacBook Air M2 with 16GB, while on Amazon an M4 Air with 16GB is usually $750-800. I get that M4/M3 aren't supported to boot Asahi yet, but still.
I really wanted this to work, and it WAS remarkably good, but palm rejection on the (ginormous) Apple trackpad didn't work at all, rendering the whole thing unusable if you ever typed anything. That was a month ago, this article is a year old. I'd love to be wrong, but I don't think this problem has been solved.
Yeah what is up with that? When I've tried to look into it I've just been met with statements that palm rejection should pretty much just work, but it absolutely doesn't and accidental inputs are so bad it's unusable without a disable/enable trackpad hotkey.
It's a year old article.
The point still stands, as the M4 was released last year and those deals were already showing up, especially with the M3 even earlier.
No, because the M4 Air wasn't even out until March of this year. It was only in the iPad and MBP last year.
I mean, for most purposes they should be very similar, so it makes sense the price is similar.
All Firefox users should switch to LibreWolf. In the short term it's for telling Mozilla to go f**; in the long term it's a browser fork with really good anti-fingerprinting.
Note that LibreWolf relies on Mozilla's infrastructure for account synchronization and plugin distribution. If you are truly hostile to that organization, is there another browser you can recommend?
Not that I disagree, but why make that point here?
They had Firefox in their dnf for asahi