Framework outlet is a great idea. I’ll add that I recently came across deals from Dell selling a 2020 XPS 13 for $449 and MSI selling an Intel-powered laptop (no discrete GPU, almost an office laptop) for $399. The older ThinkPads will be reliable, though.
It’s from the phrase “big wheel”, meaning a person with a lot of power/influence. Similar to “big cheese”… It would have been better to use “cheese” instead of “wheel” IMO.
Pretty sure it’s not. I saw something on this topic a few weeks ago but can’t quite remember the details. IIRC, it was a term from an early OS, where a bit in memory served as the privilege bit and could be set or unset by turning a physical wheel on the computer. This stuck with some of the people who went on to develop UNIX, so they named the privileged group wheel, but none of them are sure who came up with it.
What OP said. But here’s a more detailed answer courtesy of GPT-4:
Adding cat /dev/random > /dev/pty23 to your .profile would result in an interesting situation whenever you start a login shell.
Behavior of the Command: The command cat /dev/random continuously reads random data from the /dev/random device file, which generates an endless stream of random bytes. Redirecting this to /dev/pty23 means it attempts to write this data to the pseudo-terminal device /dev/pty23.
Impact on Shell Startup: When you add this to your .profile, every time you start a login shell (like when you open a new terminal session), it will execute this command. Since /dev/random produces an endless stream of data, the cat command will not terminate on its own. This means your shell will be stuck executing this command, and you won’t get a prompt to enter new commands.
Interactive Shell Issue: The shell remains technically interactive, but because the cat command doesn’t complete, you won’t get a chance to interact with it. The shell is effectively blocked by the cat command continuously running.
Potential Problems: /dev/pty23 might not exist on your system, or you might not have permission to write to it. In that case the redirection fails with an error before cat even runs, and the shell starts normally, so the prank only works if the target pty actually exists and is writable.
Fixing the Issue: To regain control of your shell, you would need to edit your .profile from a context where it doesn’t get executed, such as a non-login shell (e.g. bash --noprofile) or recovery mode.
In summary, it’s a kind of “prank” command that renders your login shell unusable until you remove it from your .profile. It’s an example of how powerful shell startup scripts can be, and a reminder to be cautious about what gets added to them!
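A minimal sketch of the “never terminates” behaviour, assuming a Linux /dev/random and GNU coreutils’ timeout(1):

```shell
#!/bin/sh
# cat /dev/random never exits on its own; give it one second,
# then check that timeout(1) had to kill it (exit status 124).
timeout 1 cat /dev/random > /dev/null
status=$?
echo "exit status after 1s: $status"
```

Exit status 124 is timeout’s way of saying the command was still running when the deadline hit; in the .profile scenario there is no timeout, so the shell just waits forever.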
I just tested it and unfortunately it did not fix the problem. Thing is, neither Lutris nor Steam is picking up the gamepad. So I don’t think Steam is eating the input, since it doesn’t seem to recognize the pad either.
Sorry, I misunderstood. What controller are you using? It seems odd that only jstest is detecting it. I initially needed to set an environment variable for my SteelSeries Stratus Duo, but I think that was a layout issue.
I’m really just using some very cheap off-brand controller I found at a store. It does work on my RetroPie, but not on my Debian PC. I’ve now switched over to my Steam Controller (since it works without any problems), and playing with it feels fine too.
So I guess the problem has been solved for me, but the mystery of what causes this issue remains.
I’d suggest checking section 5.3 of the Arch Linux wiki’s gamepad page. Debian probably has an older version of the relevant package, or RetroPie might carry extra patches. Couldn’t say which package for certain, though. Arch Wiki Gamepad
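For anyone hitting the same thing later: a quick way to compare what the legacy joydev interface (what jstest reads) and evdev (what SDL, and therefore Steam and Lutris, read) can see. The paths are the usual Linux ones; the actual output depends on your hardware:

```shell
#!/bin/sh
# Legacy joydev device nodes -- this is what jstest opens:
js_nodes=$(ls /dev/input/js* 2>/dev/null || echo "no joydev nodes")
# evdev devices and their handlers -- this is what SDL (Steam/Lutris) uses:
ev_info=$(grep -E '^N: Name|^H: Handlers' /proc/bus/input/devices 2>/dev/null \
  || echo "no evdev info available")
printf '%s\n%s\n' "$js_nodes" "$ev_info"
```

If a pad shows up under /dev/input/js* but has no matching evdev entry (or the wrong Handlers line), that would explain jstest seeing it while SDL-based software does not.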
They may have been, things were far more trusting back then.
X servers, for example, would accept any connections. So we would often “export DISPLAY=friendscomputer:0.0” in the computer lab and then open windows of embarrassing content. Which at the time would likely be ASCII art…
One of my favourite wars was opening audio files on other people’s SPARCs; somebody had the loudest bagpipe music, which usually ended things.
Access to the SPARCs was normally restricted to third-year students, but if you knew the right person you could get an account created pretty easily. They had the fastest internet access in the uni at the time as well.
I used to work at a company that did distributed QA. Other people’s tests would run on your desktop. It worked surprisingly well. But occasionally a test of some audio resource would play on your speakers “The discrete cosine is a real, discrete version of the fast Fourier transform.”
Honestly, this used to be the case, but the past couple of years Lenovo is going back to their old ways of sub-par upgradability, and sub-sub-par support across models for Linux. I believe the P-series is the current most compatible line.
You might want to consider getting a slightly older refurb you KNOW is very compatible versus a newer one, because it’s a crapshoot. Make sure to avoid any models with soldered memory (they specify on their site), and if you’re buying a modern AMD model, do some research and make sure they haven’t crippled any features in the BIOS.
If you’re not completely sold on Lenovo, look at getting a Framework laptop. It’s the most upgradable and repairable laptop of any kind out there. They also just started an online outlet store where they sell last-gen models at deep discounts that you could upgrade to the current gen when the time comes.
Hey, thanks, this outlet store thing may be just what I need! I wanted a Framework but didn’t want to take out a second mortgage on my house, lol. That’s why I was considering Lenovo: Black Friday deals that I assume aren’t going to apply to Frameworks (but I’m still gonna check Fri/Mon) but do apply to the Lenovos.
Correct. In its mission statement, Framework says they won’t be doing random sales, preferring to keep prices consistent so customers know they’re always getting the lowest price. I’m signed up for an AMD 16", but those outlet prices are crazy good, so I bought one of the 13" Intels as well to play with 😂
To be honest, Ubuntu likely has nothing to do with it and I find the headline therefore misleading. It’s mostly the Linux kernel from how it reads.
Ubuntu 23.10 was run for providing a clean, out-of-the-box look at this common desktop/workstation Linux distribution. Benchmarks of other Linux distributions will come in time in follow-up Phoronix articles. But for the most part the Ubuntu 23.10 performance should be largely similar to that of other modern Linux distributions with the exception of Intel’s Clear Linux that takes things to the extreme or those doing non-default tinkering to their Linux installations.
Proprietary snap store backend that is controlled by Canonical: that’s it.
I used Ubuntu for years and installed it for family and friends. I moved away around a year ago.
Moving packages like Firefox to snap was what first started annoying me.
If the backend was open source, and the community could have hosted their own (like how flatpak repositories can be), I might have been slightly more forgiving.
Did a quick Google search to see if someone had elaborated; here’s a good one:
It is also a commercial distribution. If you’ve ever used a community distribution like Arch, Gentoo or even Debian, you’ll notice that they encourage participation much more. You can contribute your ideas and work without being required to sign any CLAs.
Because Ubuntu wants to control/own parts of the system, they tend to create their own, often subpar, software that requires CLAs, rather than contributing to existing solutions. See Upstart vs. OpenRC (and later systemd), Mir vs. Wayland (both of which they later adopted anyway), Unity vs. GNOME, snap vs. Flatpak, MicroK8s vs. k3s, Bazaar vs. Git or Mercurial, … The NIH syndrome is pretty strong in Ubuntu. And even where Ubuntu came first with some of these solutions, the community had to create the alternative because Canonical was controlling them.
Serving files over HTTPS is not difficult to implement, if anyone cared. Even if the cloud backend were open source, you still wouldn’t use it. Downvote now!
I’ll add one more gripe: Amazon integration. It was resolved about seven years ago, but I still hold it against them a bit for placing Amazon search results on my desktop all those years back. Not that I don’t have an Ubuntu server running as we speak, but it still taints them a tad in my eyes (and probably feeds the “it’s a corporate distro” theme of dislike around here).
Ahh, okay, so nothing new under the sun: Hipsters hate normies and September never ended.
Although I’m under the impression that Mint and Pop have taken a bite out of the “beginner desktop” market, Ubuntu is most of what I observe in the office when everybody else is booting Windows.
I can understand selecting for novelty; I’m usually in that camp. But novelty shouldn’t come at the expense of an argument to IT departments that they should support at least one Linux distro.
My main issue with Wayland is the fragmentation. An abstract protocol that may or may not be implemented by a particular DE/WM means nothing to a user, who now has no guarantee that their tools will work under all environments. For example, a screengrab utility might work under GNOME but not under a wlroots-based WM, just because the relevant protocol is not supported there. That’s a major drawback to me: we lose flexibility and are kind of forced onto mainstream DEs, which have enough dev capacity to support most of the Wayland protocols. Contrast that with X.Org, where most of the functionality is implemented by the server itself and the protocol exposed to clients is far simpler.
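To illustrate the point: a grab tool can’t even assume which world it’s running in, so it has to branch on the session type before it gets to protocol negotiation. A sketch (the protocol names in the messages are just examples of what a real tool might try):

```shell
#!/bin/sh
# A screenshot tool has to pick a backend at runtime, because no
# single grabbing mechanism works under every environment.
session="${XDG_SESSION_TYPE:-unknown}"
echo "session type: $session"
case "$session" in
  wayland) echo "needs a compositor-dependent path (portal, wlr-screencopy, ...)" ;;
  x11)     echo "can use core X11 grabbing, which works under any X server" ;;
  *)       echo "unknown session; fall back to xdg-desktop-portal if available" ;;
esac
```

The x11 branch having exactly one universal answer, while the wayland branch is an open-ended list, is the fragmentation complaint in miniature.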
I have a T580 with NVIDIA graphics. Repairability is great: you can find a manual with step-by-step instructions for every part online.
But the thermals in that thing are awful, especially on Linux and doubly so with the GPU. It has some stupid on-lap detection that heavily throttles the system so it doesn’t burn the user. Until a few years ago there was no Linux driver for it, so it always defaulted to on-lap mode. Even worse, the GPU has a hardcoded 70 °C limit and throttles down to the lowest clock rate when it reaches it; and it reaches it quickly, because the CPU and GPU share a heatpipe.
Nowadays I just run it on the integrated Intel graphics on Wayland and it’s great. But it would be cool if I could use the GPU that is at least theoretically able to run Doom 2016 at 30 fps. But practically it struggles with Quake 3.
It’s just a shame that you probably won’t know about these kinds of problems on a new laptop because people only notice them after a few months to years.
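If you want to watch that kind of throttling happen yourself, the generic sysfs thermal interface is enough. A sketch using the standard Linux paths (zone names and counts vary per machine, nothing here is T580-specific):

```shell
#!/bin/sh
# Print each exposed thermal zone's name and current temperature in °C.
# /sys/.../temp reports millidegrees, hence the division by 1000.
count=0
for z in /sys/class/thermal/thermal_zone*; do
  [ -r "$z/type" ] && [ -r "$z/temp" ] || continue
  printf '%s: %d°C\n' "$(cat "$z/type")" "$(( $(cat "$z/temp") / 1000 ))"
  count=$((count + 1))
done
echo "zones read: $count"
```

Running it in a loop under load shows whether a GPU zone pins at its limit while the clocks drop.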
Go with the T-series, not the E or L. Better-quality materials, better warranty options (you can get up to 5 years), longer spare-parts availability, and usually more that you can replace yourself.
True! T series or P series are much better made. I’d also advise heading over to Lenovo support site and checking the service manual for any machine you’re interested in, just to make sure that the features you may want to upgrade are upgradable.
I’ve noticed Lenovo doing a lot of SoC-style systems à la Apple, where your RAM is one and done. It’s mostly been in the thin-and-light segment, but…
My biggest complaint has been that they don’t put the USB-C ports on a daughter card. I don’t know what the cost savings are, but I literally had two machines whose USB ports users had killed that spent close to 10 months waiting on parts for a warranty repair.
It’s supposed to be tuned more toward heavy workflows, such as rendering and CAD. It supports more RAM (up to 6 TB) and four-socket SMP, along with ReFS and SMB Direct.
I only found out about it because we needed a beastly set up for combining lidar and drone aerials in Autodesk.
Is there some reason to think that running Windows 11 Pro for Workstations would have made a difference in a CPU benchmark? I’m not seeing anything obvious on the feature list for that version that would make that be the case.