For instance, even if you have an old Intel integrated GPU, chances are you can still benefit from AMD’s FSR just by passing a few environment variables to Proton GE, even when the game doesn’t officially support it, and you’ll get what amounts to a free FPS boost (tested it for fun and can confirm on an Intel UHD Graphics 620).
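If anyone wants to try it: as far as I remember you just set a couple of Wine environment variables in the game’s Steam launch options (flag names from memory, double-check them against your GE-Proton changelog), then pick an in-game resolution below native so the upscaler has something to scale:

```
WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%
```

The first variable is what actually turns FSR on for fullscreen games; the second is supposed to control the sharpening strength.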
Congrats! Your laptop will be even happier with a lighter but still nice-looking desktop environment like Xfce, and there’s even an official Ubuntu flavor built around it: Xubuntu.
HIP is amazing. For everyone saying “nah, it can’t be the same, CUDA rulez”: just try it. It works on NVidia GPUs too (the headers are basically macros and inline wrappers that remap everything to CUDA API calls), so if you code for HIP you’re targeting at least two GPU vendors. ROCm is the only framework that lets me do GPGPU programming in CUDA style on a thin laptop sporting an AMD APU while still enjoying 6 to 8 hours of battery life when I’m not doing GPU work. With CUDA, in terms of mobility, your only choices are a beefy and expensive gaming laptop with pathetic battery life and heating issues, or a light laptop plus SSHing into a server with an NVidia GPU.
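For anyone who hasn’t seen it, here’s a minimal sketch of what the CUDA-style workflow looks like in HIP (just a vector add; compile with hipcc, and the same source builds against ROCm on AMD or the CUDA runtime on NVidia):

```cpp
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

// Same __global__ / threadIdx style you would write in CUDA.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    float *da, *db, *dc;
    hipMalloc((void**)&da, n * sizeof(float));
    hipMalloc((void**)&db, n * sizeof(float));
    hipMalloc((void**)&dc, n * sizeof(float));

    hipMemcpy(da, ha.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, hb.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // hipcc accepts the familiar triple-chevron launch syntax on both backends.
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    hipDeviceSynchronize();

    hipMemcpy(hc.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    std::printf("c[0] = %f\n", hc[0]);  // expect 3.0

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```

On the NVidia path the hip_runtime header is more or less a thin layer of inline wrappers over the corresponding cuda* calls, which is why the same source targets both vendors.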
It depends. I’m working in the quant department of a bank and we work on pricing libraries that the traders then use. Since traders often use Excel and expect add-ins, we have a mostly Windows environment. Our head of CI, a huge Windows and PowerShell fan, eventually decided to add a few Linux (RHEL) servers to run automated Valgrind checks and gcc/clang builds, so our code is continuously tested for warnings, undefined behavior (gcc with -O3 does catch a few cases) and the like.
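To give a made-up example of what “-O3 catches a few of them” means: some of gcc’s diagnostics only run when the optimizer’s value-range and loop analyses are active, so code like the snippet below tends to compile silently at -O0 but gets flagged at -O2/-O3 with -Wall (exact behavior depends on the gcc version):

```cpp
#include <cstdio>

int main() {
    int a[4] = {0, 0, 0, 0};
    // Off-by-one: the last iteration writes a[4], one past the end of the array.
    // Recent gcc at -O2/-O3 typically reports this via -Warray-bounds or
    // "iteration 4 invokes undefined behavior"; at -O0 it says nothing.
    for (int i = 0; i <= 4; ++i) {
        a[i] = i;
        std::printf("%d\n", a[i]);
    }
    return 0;
}
```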
I thought: cool, at least Linux is making it into this department. Then I logged into one of those servers.
The fucker didn’t like the default file system hierarchy, so he created stuff like `/Applications` and `/Temp`, and he installs programs by manually downloading binaries and extracting them there.
When I was doing my applied math PhD, the vast majority of people in my discipline said “machine learning”, “statistical learning” or “deep learning”, but almost never “AI” (at least not in a paper or at a conference). Once I finished my PhD and took my first quant job at a bank, management insisted that I use the word AI more in my communications. I make a neural network that simply interpolates between prices? That’s AI.
The point is that top management and shareholders don’t want the accurate terminology; they want to hear that you’re implementing AI and that the company is investing in it, because that’s what pumps the company’s stock as long as we’re in the current AI bubble.
MobileTechReview (www.youtube.com/ ). Lisa’s reviews feel the most authentic to me, without too much bullshit, and they’ve always helped me with my buying decisions.