You don’t need any internet connection to install Ubuntu. Just use the normal installer, not the minimal network installer. Install from a USB stick.
Also, there’s no requirement for a wire either. If that were the case, you could never install on any modern laptop.
You would need some sort of functioning network to upgrade packages or install anything not in the base image, but that all happens after installation, when you already have a working OS, and wired vs. wireless won’t matter.
Ah okay. I just remember hearing that all your drivers need to be manually installed and updated in Linux, so for me that included ALL drivers, even basic ones like that. If I can get started wirelessly that would be perfect. Thanks!
It’s almost completely the opposite: drivers are (almost entirely) a Windows problem. If you’re willing and able to go the open-source route, which for most people means “I don’t have an NVIDIA card, or I don’t plan on getting every ounce of performance from it”, you don’t need to worry about drivers at all (bar some weird cards, but they’re getting rarer and rarer; I can’t remember the last time I had to install a driver that wasn’t NVIDIA’s).
Good to know I should avoid NVIDIA for Linux. The only NVIDIA card I have is on my gaming rig, so I don’t plan on having to deal with that since I’m sticking with Windows on that until (hopefully) more studios start caring about Linux compatibility. Can’t wait to cut that Microsoft umbilical cord permanently.
That said, do I need dedicated graphics on a Plex server? I was going to go integrated, but your comment made me realize I never checked hardware requirements. Which are probably on Plex’s website. Which I am now going to go check because Lemmy isn’t Google and it’s not your responsibility to hand me answers I can easily find.
Just to avoid any misunderstandings for the future: you can run NVIDIA cards on Ubuntu, you just have to install their proprietary driver, and on Ubuntu it’s pretty easy to do so. I’ve used a few different NVIDIA cards on Ubuntu over the last few years and never experienced any issues after installing the recommended driver. Before installing the driver I got some flickering and artifacts, but with the right driver everything should be fine. Even for AMD graphics you can install the proprietary drivers from their website to get the maximum performance out of the GPU, but for AMD you can also use the “pre-installed” open-source driver, which performs much better than the open-source driver for NVIDIA cards. Integrated graphics are supported out of the box in almost all cases.
Even Nvidia video works out of the box without any additional drivers.
The thing with Nvidia is that although the default drivers work, they are more generic and don’t take advantage of all of the features and performance of recent cards. Most people would want to load the proprietary drivers from Nvidia to take full advantage of the card.
Linux would normally include the better drivers, but Nvidia keeps them under a software license that prevents Linux distributions from bundling them.
Even with this, Ubuntu includes a tool that will download and install these drivers that they can’t bundle.
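On recent Ubuntu releases that tool is `ubuntu-drivers`. Roughly how it’s used (the exact driver package version varies by release and card):

```shell
# Show detected hardware and the recommended proprietary driver:
ubuntu-drivers devices

# Install whatever Ubuntu recommends for this machine:
sudo ubuntu-drivers autoinstall

# Or install a specific driver series manually, e.g.:
# sudo apt install nvidia-driver-535
```

These commands only do something useful on an actual Ubuntu machine with the hardware present, so treat them as a pointer rather than a recipe.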
I think I wasn’t clear: for NVIDIA cards you need to take some action. On some distros it’s ticking one box during installation, on others it’s installing the driver afterwards, but they work; all of my current computers are NVIDIA. Even without installing the proprietary drivers, NVIDIA cards work fine for 90% of things; the problem is that gaming will have less performance and you wouldn’t be able to use CUDA.
I know you’re googling it, but in any case AFAIK Plex can run on integrated cards, most cards can decode video nowadays so it shouldn’t be particularly hard. If you’re looking into using Plex I recommend checking Jellyfin, it’s an open source alternative, I’ve been using it for years and have nothing to complain about.
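If you want to poke at Jellyfin quickly, its official Docker image is an easy way to try it; `/path/to/media` below is a placeholder for wherever your library lives:

```shell
docker run -d --name jellyfin \
  -p 8096:8096 \
  -v jellyfin-config:/config \
  -v jellyfin-cache:/cache \
  -v /path/to/media:/media:ro \
  jellyfin/jellyfin
# Web UI should then be at http://localhost:8096
```

This needs a running Docker daemon, so it’s a sketch of the setup rather than something to copy-paste blindly.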
I think I just misunderstood how “DIY” Linux was and thought it came with essentially no drivers. I thought it was kind of like rooting an Android: you get more control in exchange for having to do everything yourself. I mistakenly lumped all drivers under “everything”.
More exotic software will probably come from the internet, but the basics should be on the DVD. Good luck with your journey, and reach out if you need any help; I’m sure everyone here would be happy to assist.
I would suspect that making a stable desktop inside docker ensures it would work everywhere else, no matter what the hw/sw of the host is.
I've only known Docker as a build environment that ensures reproducible builds, and I can't say I ever liked it. I think its popularity comes from some myth of safety and security.
I couldn’t possibly care less about Docker Desktop. Portainer is a much better solution when graphical administration becomes necessary. (Which should be never)
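For reference, Portainer CE’s standard Docker deployment (per their docs, give or take the version tag) looks like this:

```shell
docker volume create portainer_data
docker run -d -p 9443:9443 --name portainer --restart=always \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:latest
# Then browse to https://localhost:9443
```

It needs a running Docker daemon and access to the Docker socket, so only run it on a host you trust Portainer to fully control.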
Unity has been getting better press because they mildly walked back a few of their policies. One prominent gamedev channel I saw (Games From Scratch, I think?) did a video praising them for booting the ironSource execs (an adware company Unity bought a while back) out of the company.
And, like clockwork, Unity proves that it was never the plucky underdog that was going to take on the behemoths of Unreal and (at the time of its inception) CryEngine. In fact, it feels like it’s more hostile to its users than either of its original competitors, which were once known for hostile and expensive features.
And again, I’m gonna shill for Godot. You’re better off using FOSS for your tech stack, primarily because this kind of arbitrary behaviour becomes standard once a company is too big to be internally accountable.
I know there are a lot of Godot tutorials out there, but are there any you would specifically recommend? I’ve got a lot of Unity experience, but I’m looking to move my personal projects to Godot.
I can vouch for Clear Code, as well. That’s where I started and learned to build some 2d platforming games. If you want to get into 3d right away, there is a channel called BornCG that has a very good series on building simple 3d platformer games, too.
If I run these as an unprivileged user via xhost, they don’t really work well.
This is not a strong security boundary and in this case is basically doing the opposite of what you want. Giving access to an X session is basically giving the app full access to your user account. As an example they can inject keystrokes to open a terminal and do whatever they want. X also gives every program access to every other program.
Running as a different user will prevent direct access to other resources of your user account which may block some generic malware/spyware that tries to gobble up random files, but keyloggers and screen captures will just work as expected because they use X anyways.
As mentioned in other comments the best solution to this is Wayland. Under Wayland apps don’t have direct access to each other. These apps use “Portals” which are trusted permission prompts. So if you try to share the screen under Wayland you will get a trusted prompt that list all windows, and if you select one the app only gets access to that one selected window.
Although it is worth noting that most apps running under your user account will have pretty broad access. This can be mitigated by sandboxing tools like Flatpak but many available Flatpaks don’t provide much isolation. Carefully check the permissions if isolation is important to you.
And for the truly paranoid anything running under the same kernel is not strongly isolated. It is likely good enough for these partially trusted apps like Zoom or Teams (they are not likely to actually try to exploit your system, just suck up more data than you would like them to) but not strong enough for running completely untrusted programs that may be malicious. You would at least want a VM boundary (see Qubes OS) or ideally different physical hardware.
Another good option is running these in a browser. Browsers are designed from the ground up to run untrusted software safely. Google Meet works perfectly in the browser and Zoom has all of the core functionality available. (I don’t use MS Teams so can’t vouch for it.) This is my main approach to isolating proprietary software as it is reliable and I also value features such as cross-platform usage. Half of these programs just run Electron anyways so running in my main browser will use less resources and be faster than running 7 different Chromium processes.
So wayland fixes most of these. Is it possible to run GUI programs as another user just like in X with xhost though ? I’m asking not only from a security point, but as a practical one since I need to run the same program under different namespaces/users
I can’t say I’ve tried. But Wayland uses a socket, so maybe you can set file permissions to let other users access it?
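An untested sketch of that idea: Wayland clients talk to the compositor over a Unix socket in `$XDG_RUNTIME_DIR` (usually `/run/user/<uid>/wayland-0`), so granting another user access to that socket may let their processes connect. `appuser` here is a placeholder, and whether the compositor accepts the connection is session-dependent:

```shell
# Let appuser read/write the compositor socket, plus traverse the runtime dir:
setfacl -m u:appuser:rw "$XDG_RUNTIME_DIR/wayland-0"
setfacl -m u:appuser:x  "$XDG_RUNTIME_DIR"

# Then, as appuser, point the client at that socket:
sudo -u appuser env WAYLAND_DISPLAY="$XDG_RUNTIME_DIR/wayland-0" some-wayland-app
```

Note this hands the other user your whole session surface again, so it weakens exactly the isolation discussed above.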
I don’t know what your exact use case is but if you just want programs to have different “profiles” you can probably do something like setting $HOME to point somewhere else or otherwise configure their data directory.
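A minimal sketch of the `$HOME` override approach; `yourapp` is a placeholder for the real binary, and a plain `echo` is used here so the mechanism is visible:

```shell
# Give each instance of a program its own home directory.
mkdir -p /tmp/profile-a /tmp/profile-b

HOME=/tmp/profile-a sh -c 'echo "instance A sees HOME=$HOME"'
HOME=/tmp/profile-b sh -c 'echo "instance B sees HOME=$HOME"'

# Real usage would look like:
#   HOME=/tmp/profile-a yourapp &
#   HOME=/tmp/profile-b yourapp &
```

Apps that follow the XDG spec may also want `XDG_CONFIG_HOME`/`XDG_DATA_HOME` pointed at the profile directory.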
Probably not the ideal method, but I’ve used a virtual machine with the disk connected via USB and then mounted to the VM to achieve something like this. It doesn’t interfere with the existing disks or UEFI of any actual hardware then.
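A rough sketch of that setup with QEMU, assuming the USB disk shows up as `/dev/sdX` on the host (both the device name and the OVMF firmware path vary by machine and distro):

```shell
# Boot the install on the USB-attached disk inside a VM, leaving the
# host's own disks and UEFI boot entries untouched.
sudo qemu-system-x86_64 -enable-kvm -m 4G \
  -drive file=/dev/sdX,format=raw \
  -bios /usr/share/ovmf/OVMF.fd   # UEFI firmware; path varies by distro
```

Passing the raw device in gives the VM exclusive use of it, so make sure nothing on the host has it mounted first.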
The problem that could occur is: right now Microsoft doesn’t care about Linux or competitors, since every OEM has to buy a Windows key regardless. If SteamOS actually becomes a shippable option, Microsoft’s cavalier attitude is going to change quickly, and a lot sooner than it will take them to get an Xbox handheld out the door.
I don’t understand how that’s a problem. Can you go into a little bit more detail about what you think the consequences might be to manufacturers choosing to use Steam OS or some other Linux operating system on their handheld devices?
One does not need to be a fan/recurrent viewer of LTT to be curious about a technology. And while most of the technical information sucks, the introductory level stuff can be useful for low and middle-end enthusiasts.
Can you answer the question raised by my post, or provide an alternate source (perhaps an article, or coverage by a different channel) for the technology discussed?
Don’t a lot of CPUs like Snapdragons already have “performance cores” and “efficiency cores” that the kernel has to be able to recognize in order to switch between them? This sounds neat but I’m just curious what’s different between these situations.
The only difference is the hardware. Intel has their own version that has been in the kernel for a long time. AMD has been struggling to land the concept.
Even Intel has these. I think this patch set goes a bit further and takes into account the silicon lottery differences between cores (according to the patch series)
I’m using the patch set on my Framework 7840U and didn’t notice a difference, though this is really YMMV.
Did you do benchmarks? It probably doesn’t help much for heavily multithreaded apps, as they should use all cores anyway. And most apps aren’t performance-critical, although it might stabilize FPS in games.
Yes, you can do that; just make sure you stay on the same CPU architecture (i.e., don’t install Linux on an SSD from an Intel laptop if you’re going to be running it on an ARM-based processor or something).
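A quick way to check is to run this on both the machine you install from and the machine you’ll boot on; the two should report the same value:

```shell
# Prints the CPU architecture, e.g. x86_64 or aarch64.
uname -m
```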
Yes, just make sure that the boot setup for the distro install is compatible with what you intend to install it onto (i.e., if your server is going to be using EFI to boot an OS, install your Ubuntu instance as GPT/EFI onto the SSD). Depending on which wireless modules you are using, where you are sourcing them, and how you are installing them, you might need to ensure Secure Boot is disabled in the BIOS of your server. This will be the case if the kernel module package you are installing doesn’t sign the wireless adapter driver you intend to use. Otherwise, most drivers you could possibly need should be baked into the kernel and you should be good to go.
(One further side note from someone who has not used Ubuntu in a long time (since 16.04’s release): it would be good to check in the /etc/fstab file that the filesystem references use either UUID or PARTUUID. Depending upon the drive layout of the server you are mounting the intended drive into, traditionally labeled references such as sda or nvme0n1 can change depending on which slots the drives are seated in. Using UUID or PARTUUID in the fstab reference alleviates any potential complications from this scenario, where fstab might otherwise reference the wrong drive when mounting partitions. I believe Ubuntu likely does this by default nowadays, but it can’t hurt to check.)
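Concretely, `blkid` shows each partition’s UUID, and the fstab entry then references that instead of the device node (the UUID below is made up for illustration):

```shell
# List partitions with their UUIDs/PARTUUIDs (usually needs root):
sudo blkid

# An /etc/fstab line using UUID instead of e.g. /dev/sda2:
# UUID=1a2b3c4d-0000-0000-0000-123456789abc  /  ext4  defaults  0  1
```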
Thanks for all the info. I have no comment since I need to watch like 3 youtube videos and spend another hour reading before I really understand that second paragraph, but I will definitely be referring back to it.
What I did pick up was that the kernel actually comes with basic hardware drivers, which is a huge relief. I have pretty standard wifi hardware on standby, so I can try that.