If it works for you, then use it. However, if you want the latest packages you'll have to NOT use the LTS releases, in which case be prepared to do a FULL REINSTALL every time a new version comes out.
Or use the LTS but use Snaps for those applications that you want to have the latest versions of. Snaps are getting better and I think eventually you won’t notice the difference between them and native apps, except for the space they take up. But that goes for Flatpak too.
Personally I use Linux Mint Debian Edition because I’m not happy with the way Canonical is going. In most cases the “old” apps are fine for me, but if I feel I need the newest version I’ll use a Flatpak.
Another rolling option is openSUSE Tumbleweed. However, since Macs use proprietary WiFi drivers, your WiFi will break with kernel updates, which can be irritating unless you have ethernet.
If it works for you, then use it. However, if you want the latest packages you'll have to NOT use the LTS releases, in which case be prepared to do a FULL REINSTALL every time a new version comes out.
This is just wrong. You can update the LTS release to the next non-LTS release. You only have to uncheck “LTS only”. You can also wait for the next LTS release.
You never need a full reinstall. I haven’t done such a thing in a decade.
Well, from non-LTS, you can always go to +1, the next release. If this happens to be an LTS, sure, you will automatically be on LTS. (Then you can change your settings to stay on LTS or keep tracking non-LTS releases.)
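For reference, that checkbox maps to a one-line setting, so here is a minimal sketch of the CLI route, assuming Ubuntu's standard update-manager tooling:

```
# /etc/update-manager/release-upgrades
[DEFAULT]
# "lts"    = only offer upgrades to the next LTS release
# "normal" = offer upgrades to every interim release
Prompt=normal
```

After changing it, `sudo do-release-upgrade` walks you to the next release in place; no reinstall needed.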
I really wouldn’t touch secondhand M-series Macs. No upgrades, no repairs, horrible components (the CPU is ok, everything else is straight from the dumpster in order to cover costs).
So when something dies on your device from a company that has a long history of terrible design and QA (I’m betting on storage), you have to pay another $1000+ to replace the whole motherboard. On top of that, I’m guessing that they’re also ripping off customers when selling those replacement boards, as having usable RAM and storage costs an extra $1000+ when buying new.
I would never buy something new from Apple as I don’t like them, but I have to admit that their hardware feels great to use. I’m not tech savvy enough to know where that would be coming from, but it makes me wonder how people could say that the components are so bad.
My girlfriend has a 2012 MacBook Pro and I put Fedora on it and it feels like such a great machine. The ram and the hard drive have been upgraded, but it feels incredible for an old machine.
If in 10 years you can get an old MacBook Pro for $200, I might jump on it even if upgradeability has been lowered.
I tried using a friend’s M1 MacBook Pro, and it’s the worst laptop I’ve touched in a while. Like, my oldest budget Core 2 Duo laptop has a better keyboard than a brand-new $2000+ device. There’s a very good reason it’s permanently docked.
it makes me wonder how people could say that the components are so bad.
I’ve mentioned a few reasons in this thread. They basically used subpar components to offset the cost of developing their own CPU.
If in 10 years you can get an old MacBook Pro for $200, I might jump on it even if upgradeability has been lowered.
It’s not lowered, it’s absolutely removed, unless you count replacing the entire motherboard as upgradeability.
It was a joke. But you can see why I wouldn’t be surprised if it turns out true in the end.
I know for a fact they’re making it impossible to make small repairs like replacing the lid-closed sensor. It requires a proprietary calibration tool they won’t sell, so after a DIY replacement the MacBook can’t go to sleep when you close the lid.
On top of that, they changed it from a sub-dollar Hall sensor to some proprietary BS that’s far more expensive.
Haven’t dug into it yet, but if that’s right then it’s not great. Then again, if electronics don’t break quickly they usually work fine for years, except maybe overheated GPUs, the occasional RAM stick, and HDDs.
I’m still unsure if I want to replace my 2016 Asus Zenbook. Other than the aged Intel CPU/iGPU and a touchpad that was unusable from the start, it’s fine.
When the M1 came out, some tech guy on Twitter did a review of MacBook Pro and Mac Studio storage. Apple literally used components that are so bad they had to disable data-safety protocols to go above HDD speeds. The end result was that losing power is likely to corrupt your data.
Besides that, Apple was cutting out “unnecessary” parts of the ARM specification in order to cut costs. The result is that the first 2(?) generations have the hardware-level exploit “M1RACLES” on top of others like “PACMAN”.
Legitimate repairability and pricing concerns aside, what parts exactly are you accusing of being straight from the dumpster? The GPU is insane for a low-power laptop, screen, speakers, trackpad are best in class. Keyboard is a matter of preference but by any objective measure it’s not bad, much improved from butterfly switches.
I would if the particular hardware had no inherent or user-caused issues and the price was reasonable compared to other purchase candidates, but it rarely is. It would also need to be Linux compatible, because the OS has always been insufferable and praised by insufferable people that need something to feel superior about with zero justification.
The PowerPC days were pretty crap, though, even if the hardware was visually pleasing. Nobody made PowerPC-compatible software. This time I guess Apple is paying fees to ARM and at least has ARM compatibility. x86 is irritating in its own right too. Man, tech has gone in all sorts of shitty directions.
they are downvoting you, but you’re absolutely right.
They can hardly be repaired and it’s impossible to upgrade them at all; even something as basic as swapping the SSD needs desoldering. They are still sold with 8 GB of RAM as the base, and that can’t be upgraded either.
Just don’t buy an 8 GB model, easy fix) But seriously, when you get a laptop that lets you work 8 hours straight on battery and still have 30% capacity left at the end of the day, there is no chance you would get back to the Intel system and plug it in every 2 hours.
there is no chance you would get back to the Intel system and plug it in every 2 hours.
Don’t be unrealistic. Most laptops in the MacBook price range will have 8 hours of usage in low-consumption mode, or around 5 or 6 if you need more power.
And at that price point they come with at least 32 GB of RAM (which can be upgraded), swappable SSDs with more capacity than the MacBook’s, a far better keyboard, and more ports.
Do the MacBooks have some extra performance per unit of battery? Yeah, I guess. But after 2 years, when that battery life is gone, you’ll probably be buying the newer model or wishing you had bought a laptop with a replaceable battery.
there is no chance you would get back to the Intel system and plug it in every 2 hours.
Don’t be unrealistic. Most laptops in the MacBook price range will have 8 hours of usage in low-consumption mode, or around 5 or 6 if you need more power.
While I completely agree on the repairability front, which is really quite unfortunate and quite frankly a shame (at least iPhones have been getting more repairable, silver lining I guess? damned need for neverending profits), it’s just… not unrealistic.
That being said, unified memory kind of sucks but it’s still understandable due to the advantages it brings, and fixed-in-place main storage that also stores the OS is just plain shitty. It’ll render all these devices unusable once that SSD gives out.
Anyhow, getting off that tangent: I have Stats installed for general system monitoring, as well as AlDente to limit charge to 80% of maximum battery capacity. All that to say, by now, after around 1.5 years of owning the M2 MacBook Air (which I’d been waiting to buy since late 2019, btw), I know pretty well which wattages to expect and can gauge its power usage fairly accurately.
I’ll try to give a generalized rundown:
High-intensity workloads (mostly in shorter bursts for me): typically around 10W. I installed Minecraft once just to test it, and I get reasonable frames (both modded and unmodded); it seemed to draw maybe 15W, thus still being able to charge (!) the battery off a 30W power supply. It doesn’t ever really go above 20W as a rule of thumb, and the CPU/GPU will be capable enough for easily 80-90% of the general population.
Idle/suspended: unnoticeable. I use my machine every day with maybe an exception or three per month, but from what I’ve read from others, the battery will dip only slightly after a month of standby, and that’s mostly due to battery chemistry I’d assume, not actual background usage.
Idle/running, light usage (yes, it’s the same category*): It mostly depends on the screen brightness (edit: I originally wrote “size”, whoops). Energy consumption due to CPU usage is by far the smaller portion; I’d say 2-4W, maybe. A really bright screen makes it jump to 8-9W, while darker-but-not-minimum brightness leaves it at… 5W maybe.
Given the spec sheet’s 52 Wh battery, you can draw your own conclusions about the actual runtime of this thing by simple division. I leave it mostly plugged in to preserve the battery for when it becomes a couch laptop in around 5-8 years, so I can’t actually testify on that yet, I just know the numbers.
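To make the division explicit, here’s a quick back-of-the-envelope sketch using the rough wattages above (the scenario names and numbers are just my ballpark observations, not official figures):

```python
# Rough runtime estimate: battery capacity divided by average draw.
# All numbers are the ballpark observations quoted above.
BATTERY_WH = 52  # M2 MacBook Air spec-sheet capacity

scenarios = {
    "light use, dim screen": 3,       # ~2-4 W
    "light use, max brightness": 9,   # ~8-9 W
    "sustained heavy load": 15,       # ~10-20 W bursts
}

for name, watts in scenarios.items():
    hours = BATTERY_WH / watts
    print(f"{name}: ~{hours:.0f} h")
```

Real-world runtime lands somewhere between those extremes, since a normal day mixes all three.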
I didn’t mean for this to come off as fanboi-y as it did now. I also really want to support Framework, but recommending it universally from my great-aunt to my colleagues is not as easy as it is with the MacBook. Given they’re a company probably 1,000 times smaller than Apple, what they’re doing is still tremendously impressive, but in all honesty, I don’t see myself leaving ARM architecture anytime soon. It’s just too damn efficient.
*At least for my typical usage, which will be browser with far too many tabs and windows open + a few shell sessions + a (may or may not be shell) text editor, sometimes full-fledged IDE, but mostly just text editors with plugins.
The thermals and battery life of my Apple silicon MacBooks are unlike any other laptop I’ve owned. When I first got one, I started thinking of recharging it not in hours, but in days. 3-4 days between charges was normal for typical use. Mind you that was not full workdays, but the standby time was so good that I didn’t have to consider that the battery would decrease overnight or in my bag. I’ve used multiple Dell, Thinkpad, and Intel Mac laptops over the past decade as well and none of them come within spitting distance on battery life and thermals. I really hope that Qualcomm can do for other manufacturers what Apple silicon has done for MacBooks.
I did some actual measurements just to confirm it: here is Minecraft in the default configuration running at 100 fps, and the CPU+GPU consumption is around 6 W in total. If you add about 5 W for the display backlight and other components, that works out to 9-10 hours of play time on my 100 Wh battery.
I have a 1½-year-old laptop with an AMD Ryzen 6860Z processor & get 9 hours on the regular running NixOS doing programming/browsing/chat. That’s not quite 8 hours with 30% to spare, but good enough that I don’t worry about carrying my charger (it’s a lightweight GaN one, so I normally keep it in my bag just in case anyway). Apple folks have this tendency to think all their hardware is massively better, but even if it’s ‘better’, it’s often just by a small margin that doesn’t make a big difference, especially when you factor in cost.
I did some actual measurements just to confirm it: here is Minecraft in the default configuration running at 100 fps, and the CPU+GPU consumption is around 6 W in total. If you add about 5 W for the display backlight and other components, that works out to 9-10 hours of play time on my 100 Wh battery.
I don’t own Minecraft (nope to Microsoft-owned software) nor would I have a reason to do 3D gaming on a battery… are you gaming at a café, the library, or something?
No, it’s just an easy sustained load that can be measured accurately. If you have some other application that provides a sustained load but doesn’t spin all the cores to 100%, please suggest it and I’ll try it.
For example, when watching a 1080p YouTube video in Safari the power consumption is only 0.1 W because it’s using hardware decoders (not including display backlight, which I can’t measure). But when I play the same video in Firefox, which uses software decoding, the consumption is around 0.7 W; not as good as the hardware decoders, but still less than a watt.
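For anyone who wants to reproduce numbers like these, here is a minimal sketch of how they can be captured on macOS, assuming the stock `powermetrics` tool (it needs sudo, and the exact sampler names and output labels vary a bit by chip and macOS version):

```python
# Sample package power a few times via macOS's powermetrics and print
# the power-related lines. Run with sudo; this is a rough sketch,
# not a calibrated measurement.
import subprocess

out = subprocess.run(
    ["powermetrics", "--samplers", "cpu_power,gpu_power",
     "-i", "1000",   # one sample per second
     "-n", "5"],     # five samples
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if "Power" in line:  # e.g. "CPU Power", "GPU Power", "Combined Power"
        print(line.strip())
```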
The 8GB models are manufactured e-waste, but the usable part of the lineup consists of great machines. They’re practically unrepairable, but they’re built not to need repairs.
If you care about swapping out the SSD or replacing the RAM, you shouldn’t buy Apple. I promise you, though, that 99% of laptop users don’t, and that includes a significant part of Linux users.
Macs are expensive as balls, but there simply aren’t any competitors for them. They’re the “overkill everything” segment that’s too small for other manufacturers to target. There are maybe one or two series of laptops that come close in speaker quality; one of those consists of gaming laptops styled after 80s sci-fi spaceships, the other comes with terrible battery life and even worse Linux support, and both lack the battery-life-plus-performance combination Apple managed to squeeze out of their CPU.
I wish someone other than Apple would produce MacBooks. It’s an awful company that produces great hardware for a competitive price, if you care about everything the MacBook has to offer. And to be honest, that’s not because Apple is such an amazing manufacturer; it’s because AMD and Intel are behind the curve (Qualcomm even more so), and the laptop manufacturers that try to compete with Apple always try to squeeze just that little bit of extra cost-cutting out of their models so their shit doesn’t cost more, and preload their top-of-the-line hardware with Windows 11 Home (the one with Candy Crush pinned to the Start menu) and their stupid GAMER software suite that works on three models and stops being maintained after two updates.
Current Apple systems are objectively superior. The display image quality is better than the competition’s, the touchpad hardware is better, the CPU is number one in the world in single-thread performance, and the battery life is unrivaled.
As for repairability, it only matters in case the device breaks, and that only happens to a small % of owners. Most people won’t need to repair it. However, you do use your device every day, so why would you give up the better user experience? Because of a small chance you might need to pay for repairs later, if at all? It doesn’t make sense.
The same argument applies to upgrades as well. If you think you’ll need an upgrade just buy a bigger version from the start. It may be more expensive but once again you get a better experience overall.
Exactly the drivel they want you to believe. I’m sorry, but even if 8 GB of RAM performs like 16 GB on other computers, which is a load of hot shit, it shouldn’t cost more than 32 GB on other computers. The markup on parts for a basic spec config is utterly insane. I highly doubt the average Apple user actually benefits from the top single-thread performance, and all of humanity’s battery tech is still awful at its best, both in capability and environmental impact, not to mention capability per dollar.
I have used Apple hardware and software from the beige-plastic days until the first laptops, and tried out macOS and iOS at every major update, and found it entirely unenjoyable, even when ignoring what is essentially DRM for hardware components on the phones, forcing you to pay for repairs when it would otherwise be an easy DIY.
Really, the Apple elitism is bizarre beyond belief. You get to be in the cool-dude club for getting scammed into paying 3-5 times the cost of each individual component you can choose a higher version of in their configurator before buying. It’s like the million-dollar gold bar app on the early App Store, except Apple wants to be the only one making that kind of easy money off of their users.
I wouldn’t say hated, but it definitely has many UI/UX choices that were about as well thought out as Windows having the old and new Control Panel plus the new-new Settings app, and yet everything was still counterintuitive. I merely gave it a chance repeatedly because I ran a computer business up until COVID hit and needed basic familiarity, and people kept telling me it was better than everything else; and really, if you don’t game, it mostly is for many workloads, but I still found things rather clunky, especially system navigation on iOS. Not saying Android is better or anything either, because while it suits me more, there is so much infuriating dumb shit.
Basically, because every other offering feels like it’s ripping me off, while Linux is free, has tons of customization beyond the purely cosmetic, and usually has several people making different (also free) solutions to each problem, coming back to anything else with any combination of hardware, software, and money entry barriers just feels like the worst value proposition possible. Maybe if I had been born into wealth and were a social media addict, I would have been an Apple fanboy.
You talk about high prices, however there is no actual competition. High-end systems like the Dell XPS and others cost the same as an M3. You do get some benefits like a touch screen or whatever, but you also get a shitty touchpad and 3 hours of battery life.
In regard to the software, I agree macOS is not the best, but maybe you noticed the topic is Fedora Linux, so you do have options now.
Their prices for RAM and storage upgrades are dogshit, but Macbooks do have objectively superior audio quality, and some of the best screens available. You just need to pretend the 256GB/8GB models don’t exist and the lineup suddenly makes a lot of sense.
Apple Silicon showed up to wipe the floor with Intel and AMD. Both now have CPUs that beat the M1/2/3, at the cost of huge power consumption and heat generation. With every non-Apple Macbook competitor, you can pick two out of “screen quality, audio quality, battery life, CPU performance” that perform well, and the rest plain doesn’t compete.
You won’t see me buy one of those things, the price is just soo goddamn high, but if you have the money to waste on these things, they’re excellent products. Especially when you’re a normal consumer and don’t plan on running Linux anyway; macOS may be janky as hell (“what’s window snapping?”) but your alternatives are Windows 11 and ChromeOS.
This is in contrast to the Intel Macbooks, which still had great screens and speakers, but were gimped by awful CPUs, comically insufficient cooling, self destructing keyboards, and so many other design flaws.
Thanks. It’s not full Wayland protocol support and it has some bugs, but something is better than nothing. UPD: network bandwidth usage has also increased.
What kind of bugs are you running into? The original Waypipe proposal claimed that it was pushing less data than X. Let’s hope it gets faster in the future.
If you look at any modern desktop application, e.g. those built on GTK or Qt, they’re basically rendering stuff into a pixmap and pushing it over the wire. All of the drawing primitives that made X11 efficient once upon a time are useless, obsolete junk, completely inadequate for a modern experience. Instead, X11 is pushing big fat pixmaps around, and that is not efficient at all.
So I doubt it makes any difference to bandwidth except in a positive sense. I bet if you ran a Wayland desktop over RDP it would be more efficient than X11 forwarding. Not familiar with waypipe but it seems more like a proxy between a server and a client so it’s probably more dependent on the client’s use/abuse of calls to the server than RDP is when implemented by a server.
I asked ChatGPT what Wayland is, since the article contains no explanation:
In this context, “Wayland” refers to a protocol and a display server protocol used in Linux operating systems. It’s an alternative to the more established X Window System (X11). The article highlights that Firefox version 121.0 has integrated support for Wayland by default, indicating that the browser can now utilize Wayland’s capabilities directly on modern Linux desktops without relying on XWayland compatibility layer, thereby enhancing performance and compatibility with the native display server protocol.
Explained by someone that doesn’t know the technical side super well.
1: It’s a new display protocol. The main difference from X11, as I understand it, is a simplification of the stack: eliminating the separate display server, or rather merging the display server and the compositor.
2: Some things impossible (or difficult) with X11 are much better supported in Wayland. They’re not necessarily available everywhere, as the core Wayland protocol is quite generic and needs additional protocols for further negotiation. Examples are fractional scaling & multiple displays with differing refresh rates.
Security is also improved. X11 skipped some security considerations (it is quite old, so maybe justifiably). In X11 it’s possible for any application to “look” at the entire display; in Wayland, applications only receive the specific surface they can draw into and use. (This has the side effect of complicating stuff like redshifting the screen at night, but in my experience that has fully caught up.)
3: If you’re interested, work in desktop application development (though I have no experience in that regard), or have a specific need for Wayland.
4: I think X won’t die for a long, long time, if ever. I’m not super familiar with desktop app development, but I don’t think it requires more work to keep supporting X.
On the other hand, most of the complaints about Wayland I’ve heard were ultimately about support. At some point, when you’re a normal user, the distro maintainer should be able to decide to move to Wayland without you noticing, apart from the blurriness being gone with fractional scaling.
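If you’re curious which one your session is already using, here is a tiny sketch (it assumes the usual environment variables your login manager exports; setups can differ):

```python
# Print whether the current desktop session looks like Wayland or X11,
# based on the environment the session manager exports.
import os

session = os.environ.get("XDG_SESSION_TYPE", "unknown")
wayland_display = os.environ.get("WAYLAND_DISPLAY")
x_display = os.environ.get("DISPLAY")

print(f"XDG_SESSION_TYPE = {session}")
if wayland_display:
    print(f"Wayland socket: {wayland_display}")
if x_display:
    # Under Wayland this usually points at XWayland for legacy X11 apps.
    print(f"X display: {x_display}")
```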
I’m not super familiar with desktop app development, but I don’t think it requires more work to keep supporting X.
It doesn’t depend that much on desktop application developers, but on GUI toolkit developers. It does need more work for GTK and Qt devs to support both. But the outcome will likely depend not so much on the amount of work as on “political” decisions. Red Hat is now somewhat actively pushing Wayland in their distros. They also have their influence on GNOME, so it’s not impossible that, due to Red Hat’s decisions, GNOME and then GTK (which is now developed mostly by GNOME developers, despite being the GIMP Toolkit initially) will ditch X “just because”.
End-user application developers usually don’t deal much with Wayland or X; they just use toolkits (GTK or Qt for the majority), and the toolkits do all the under-the-hood work for them.
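As a small illustration of that: a minimal GTK window in Python (assuming PyGObject and GTK 4 are installed) contains no Wayland- or X11-specific code at all; GDK picks the display backend at runtime, and you can override it with the `GDK_BACKEND` environment variable.

```python
# Minimal GTK 4 app: nothing here mentions Wayland or X11.
# GDK chooses the display backend at runtime (force one with
# GDK_BACKEND=wayland or GDK_BACKEND=x11 when launching).
import gi
gi.require_version("Gtk", "4.0")
from gi.repository import Gtk

def on_activate(app):
    win = Gtk.ApplicationWindow(application=app, title="Backend-agnostic")
    win.set_default_size(300, 120)
    win.present()

app = Gtk.Application(application_id="org.example.BackendDemo")
app.connect("activate", on_activate)
app.run(None)
```

Launching it with `GDK_BACKEND=x11` versus `GDK_BACKEND=wayland` is the only place the backend shows up at all.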
Why do you think Ubuntu is the favourite distro at Microsoft? They tried extinguishing Linux through SUSE, but are now back on the old EEE plan with Canonical helping them.
My personal favorite is Debian. I'm the IT director at my job, and 90% of our machines, including end user workstations, are running some form of Linux.
One really nice thing is that most stuff is saved somewhere in your home directory. You can switch between all sorts of distros, and if you install the same software, browser, email client, etc. most of your stuff will automatically be there and work out of the box.
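To see what would carry over, here is a small sketch listing a few of the usual per-user locations (a hypothetical selection; your applications may use others):

```python
# Check which common per-user config/data directories exist in $HOME.
# These are the kind of paths that survive a distro switch if /home is kept.
from pathlib import Path

home = Path.home()
candidates = [
    ".config",        # XDG config: most modern desktop apps
    ".local/share",   # XDG data: app data, fonts, icons
    ".mozilla",       # Firefox profiles
    ".thunderbird",   # Thunderbird mail profiles
    ".ssh",           # SSH keys and known hosts
]

for rel in candidates:
    path = home / rel
    status = "present" if path.exists() else "absent"
    print(f"{path}: {status}")
```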
I didn’t know this for a long while when distro hopping. Since every distro tinkered with GRUB and whatnot, I really hated debugging GRUB, and I was afraid of something happening to my home directory, so I overwrote it every single time. I wish I’d had a separate drive just for it when I began with Linux.
I was worried I would have to ask for a tl;dr for dummies like I’m 5, but everything is categorized nicely under questions one may have on the topic. It’s been a while since I’ve been able to get meaningful information from a website without a huge commitment.