Ok… What are you even talking about? Most fusion solutions use the same second stage for power generation as many other power generation solutions: heating water to spin a turbine. That is the same thing as all coal, natural gas, petroleum, and solar thermal. In a roundabout way you could even say hydro is just generating power from heated water, if you abstract it to include the rain cycle that moved the water behind the dam. There is literally zero “weakening” needed to generate power from fusion under the current predominant paradigms being researched. Tokamaks and inertial fusion both generate fusion and bleed the excess heat off to boil water. The only method with promise that does not use this approach compresses colliding superheated plasma vortex rings in a strong magnetic field to induce fusion, causing the plasma’s magnetic field to ramp up and push back against the containment field. The flux is captured directly into an electrical current that is shunted into a capacitor bank so it can be slowly discharged into the grid. This last method is the only one that has the potential to overload the grid if some sort of runaway event happens, though I don’t see how it would, as every stage of it is reasonably confined by well-known physics.
The feds give the states more than $16b per year to build and run shitty, custom made IT systems for their Medicaid programs. It’s basically a subsidy to IT companies. There are thousands of examples like this, where spending money on fundamental science is clearly a better investment.
Fun fact, they were going to build one in the US crossing the borders of LA, TX, AR. They even dug out the damn hole, but they shit canned the whole project so now we’re just left with a random giant circular hole underground.
Edited AK to AR. That would have been a bit excessive.
Alas, I don’t think he will much care to build a subway-but-shitty between one farm outside Waxahachie, TX, to another farm outside of Waxahachie, TX. Not enough density of mouthbreathing Elon stans there.
Thanks for clearing that up, I thought it was finished or near completion. Glad they decided to stop production when they did, but sucks that we didn’t get it.
Zounds, a collider over 3000 miles wide would have been quite the achievement! Here’s hoping they get back to it; that’s gotta be worth a ton of science points.
Kelvin mostly seems to be used to measure unimaginably hot things (like ovens, metal forges, stars) or unimaginably cold ones (e.g. planets beyond Mars). Fahrenheit still exists only because the US Congress was lazy (though as an American I do find it somewhat useful for comparing weather and Earth’s climate zones in finer detail than just -1 in winter and 28 in summer). And I’ve never heard of that last one.
The problem is that you’d need the quarter-million dollar electron microscope to test your reverse-engineered modern version, and if you get something wrong and you fry it…
That being said, I wonder why labs don’t just make a VM. Hardware passthru is definitely a thing, parallel port cards exist (as do serial port) and you can back up a VM to whatever modern storage you want. Maybe the problem is proprietary cards/connectors? PCI-X or older?
I had to reverse-engineer a floppy disk encryption scheme that was performed by some DOS software that directly talked to the floppy disk controller. There’s no emulating that. A USB floppy drive can’t even be operated in the same way.
It was easier to just crack the (admittedly trivial) encryption.
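For flavor, here’s a sketch of what cracking something that trivial can look like. The actual scheme above isn’t described, so this assumes a hypothetical repeating single-byte XOR, a common “encryption” on DOS-era disks, and exploits the fact that sectors are usually padded with a known byte:

```python
# Hypothetical sketch: the real scheme isn't described above, so this
# assumes a single-byte repeating XOR, a classic DOS-era "encryption".
def xor_crypt(data: bytes, key: int) -> bytes:
    # XOR is its own inverse, so the same function encrypts and decrypts.
    return bytes(b ^ key for b in data)

def guess_key(ciphertext: bytes) -> int:
    # DOS-era sectors are typically padded with 0x00; the most common
    # byte in the ciphertext is then the key XORed with that padding.
    most_common = max(set(ciphertext), key=ciphertext.count)
    return most_common ^ 0x00

plaintext = b"HELLO WORLD" + b"\x00" * 8   # toy sector with zero padding
ciphertext = xor_crypt(plaintext, 0x5A)

key = guess_key(ciphertext)
print(hex(key))                     # recovered key
print(xor_crypt(ciphertext, key))   # recovers the plaintext
```

Real schemes of that era were rarely much stronger than this, which is presumably why cracking it beat emulating the controller.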
You can get PCIe-to-PCI cards. I think PCIe is pretty much backwards compatible with PCI and only a little logic is required. And PCI-X cards do work in PCI slots at reduced bandwidth.
Tho, if a system works without issue, why touch it? A replacement only makes sense once parts become hard or expensive to come by.
I'm sure some do, but there's also a certain simplicity to "back up the Win95 machine" and "collect working Pentium 2's from eBay," particularly for fields that are not interested in IT for its own sake. A virtual machine adds an extra layer of abstraction and complexity, though I'm sure there's a slow trickle as entities have trouble replacing hardware or luck into technically savvy and ambitious staff. I've certainly seen my share of data being entered into a Windows 10 app that sure as shit seems to be a terminal emulator running some green-text dinosaur, or else it's got a set of Visual Basic widgets that seem like they'd be compatible with one.
The rant in the post has some merit to it, but the thing it sort of misses is also the reason not to use VM. It works just fine. It hasn’t been updated in 20 years because it still works. It does what it says on the box. Why put it in a VM? What would you gain from it? If you need Internet just grab a laptop and have it sit next to the main computer. That way users have a much smaller chance to break something vital. Pretty much all the control computers are air gapped anyway. No updates or anything to break things you reeeeally don’t want to break.
The only case I’ve seen VMs being used is if the old computer breaks and you can’t really find something that’s compatible with old-as-fuck software on bare metal. I work in a cleanroom and we got sooo many systems that are Windows 95 or older (DOS, anyone?). Electron microscopes, etching systems, probe stations…
The merit is security, as you can manage what goes into the VM, as opposed to having the hardware where people can just plug in a flash drive or network cable.
Then there is also the improvement to not needing to maintain the old hardware, and having a backup of the entire system that you can just copy to a different system and have everything running again.
Sure I can see it being a security feature, random USBs are not a good thing, but I feel like it is quite minor with an air gapped system, no?
The backup is a good point. Though from this I started wondering how difficult it is to get the VM to communicate with old hardware. Like, the hardware might use some random method of actually communicating with the computer, and getting that through to the VM might be problematic? I have no clue, just spitballing here.
piracy isn’t theft, but how do you feel about “stealing” from a thief? in the case of corporate software, the company already stole the surplus value created by their developers’ labor.
Publishers and film makers too. Keep it in print or lose rights (though I’d rather have much shorter copyright periods). Changed products get their own copyright, but the old version falls out if you stop selling it.
But look how fast we can make those little fuckers go!
It’s just like slot car racing, round and round, but… you know… faster. And yeah, it’s more expensive than a regular slot car track, I guess. But still, those particles will beat any slot car you care to pick! So there’s that. Welllll not those fancy slot cars with them high performance motors, I mean, that’s a completely different ballgame there, we can’t compete with that.
But still, those particles whizzing around, it’s gonna be pretty cool. I reckon we should do it.
So anyway, thank you for reading my financial proposal for the SuperLHC.
Not only does most scientific instrument software become abandonware, but there are companies that sell instruments that use the exact same components as they did 20 years ago. The only difference is now they swapped the stainless steel parts for plastic and charge luxury car prices for what will be a piece of garbage in 3 years. These pieces have nothing to do with chemical compatibility and everything to do with increasing the frequency of maintenance that the older models never needed.
I’ve seen things you people wouldn’t believe. Critical government services running COBOL. Programs stored on magnetic tapes, entire offices dependent on one guy who’s retiring. All that code will be lost in time, like tears in rain.
There is genuine money to be made in learning the “dead languages” of the IT world. If you’re the only person within 500 miles who knows how to maintain COBOL, you can basically name your price when it comes to salary.
I just wish I had the slightest interest in programming
I’ve seriously looked into picking one of these dead languages up and honestly, it’s not worth it.
Biggest issue is, you have to be experienced to some degree before you get to the name-your-price levels. So you’ll have to take regular ol’ average programmer pay (at best) for a language that’s a nightmare in 2023. Your sanity is at heavy risk.
I’d honestly rather bash my head against assembly; it’s still very much in use these days in a modern way. Most programs still get compiled to it anyway (albeit to a far more complicated instruction set than in the past), and it can still land you some well-paid positions for not a whole lot of experience (relatively).
I’ve been meaning to learn Fortran in part because of the whole “big bucks for being willing to maintain old software” thing, but mostly because I’d like to work on the sorts of scientific computing software that was (and still often is) written in Fortran.
Yeah man I’ll take plain old php and java any time of day, I can still get enough money from it to pay my lifestyle. And at 5pm I can close my laptop and play vidya with no worries.
COBOL isn’t too terrible. It has its gotchas (like sizing variables for inputs, where you don’t need space for the data’s headers and will break stuff if you do), but mostly it’s an old language designed to be easy to use.
New staff in my workplace first using COBOL (with other build experience) learn it to the point they’re productive in a week or two
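Since running COBOL here isn’t practical, a rough Python analogue of the sizing gotcha above: a COBOL `PIC X(n)` field has a fixed size, silently truncating on overflow and space-padding on underflow, so a field declared too small quietly drops the tail of the input.

```python
# Rough Python analogue of a COBOL PIC X(n) alphanumeric field:
# fixed size, silent truncation on overflow, space-padding on underflow.
def pic_x(value: str, size: int) -> str:
    # COBOL raises no error here -- the excess is simply cut off.
    return value[:size].ljust(size)

# Field sized for the data itself; no room needed for any headers.
print(repr(pic_x("SMITH", 10)))             # 'SMITH     ' (padded)
print(repr(pic_x("FITZGERALD-JONES", 10)))  # 'FITZGERALD' (tail dropped)
```

The silent truncation is exactly why undersizing a field “will break stuff” without any warning at runtime.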
This is one of those fantasies people have. You might as well hope to win the lottery.
Imagine being the only person who can play an extremely custom instrument. Unless someone absolutely needs you, you’ll be sitting around hoping to get a job. Worse, a company is more likely to hire some people to rebuild it than to hope to find this unicorn who can do this.
Source: been in the industry for 15 years. I’m one of those guys you hire to migrate old software to a web app. And more often than not, companies will pay to modernize rather than keep supporting outdated tech.
The COBOL case is a bit different. You can’t just modernize millions of lines of functionally unique code without service disruption, and services that use COBOL at that scale tend to be very sensitive.
The fact that COBOL as a language is an atrocity to both use and read doesn’t help either.
Been in the industry for 10 years and I deeply disagree with you. I work in COBOL.
Not that migrations don’t happen, but in my experience, many, many companies kick that can down the road each year, because migrating huge and critical services is extremely costly, time-consuming and risky. In the short term, just paying people to maintain the dinosaurs is waaaay cheaper.
Also, it’s extremely easy to get a job in it (my company now hires people with no IT background and tries to teach them COBOL from scratch), because even though it’s a niche, the demand for it still outweighs the supply of people willing to learn it.
Will it die out eventually? Maybe. I’ve been hearing about its death for a decade, so I’ve become skeptical about it in the short term.
Edit: would also like to point out that it is indeed a fantasy that it pays truckloads of money. Does it happen? Sometimes, but you need to be really good and experienced at it.
I’d rather not dox myself, but I can tell you I’m in Eastern Europe working for a Western European bank. COBOL is still heavily used in the banking and insurance sectors, by companies that started using it 50 years ago.
If you do manage to learn the ropes, the salary does tend to be above average for a mid-level programmer.
There is some logic to running older stuff, a lot of it is a closed system and it’s harder for threats to target it. Banks are a big one that still run a ton of our financial infrastructure on COBOL.
Hospitals also run on a ton of abandonware, same with machine shops. Ultrasound machines are still running Windows 95, because upgrading a few machines to Windows 7 or 10 costs a hospital millions. So you just air-gap the systems for security.
Unfortunately, they still have parts that fail. The good news is most of it is being replaced with new old stock, so technically not new stuff. I know a good number of companies that have stockpiles of basically museum-level hardware to replace failing parts.
Kinda related: in the company I used to work at, everything was done in SAS, a statistical analysis software (SAS, duh) that fucking sucks. It used to be great, but once you’re in their environment you’re trapped for fucking forever. I hated it and refused to learn it beyond what was needed for my daily tasks. A couple of months ago I moved to another company that used to pay a consulting firm for my job, so my boss and I had to start everything fresh. The first thing we did was study what we were going to use as statistics software, and I fought tooth and nail for Python. One of the points I pushed was that if in the future we decide to move off Python, we could easily do it, while other solutions would lock us in.
If you rely on free packages in Python for processing, those are as likely to become obsolete as anything else (if not more likely). I also really dislike the compatibility issues with different versions of different packages, the whole environment aspect. Buying new computer with different version of windows? Who knows what will work there.
In this sense for scientific computation I prefer something like MATLAB. Code written 40 years ago, most likely would still work. New computer? No problem, no configuration, just install Matlab, and it runs! Yes, it costs money, but you get what you paid for. Mathematica is another option, but I mean ugh!
I mostly use pandas, which I don’t think is going anywhere. We’re also going to start tests with a library called ‘chainladder’ that is used for some actuarial reserve calculations. For everything else I’m programming custom functions because, as far as I know, there aren’t a lot of actuarial mathematics libraries in Python (R has much more support for that, but I prefer the flexibility of Python; a good portion of my job is scraping our regulatory body’s website for information, and I’m not sure how well R works for that).
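For anyone curious what that chainladder library does, here’s the core idea in plain Python (this is not the library’s actual API, just a minimal sketch on a made-up cumulative claims triangle): compute volume-weighted development factors from the observed columns, then project the open cells.

```python
# Minimal chain-ladder sketch on a toy cumulative claims triangle.
# Not the chainladder library's API -- just the underlying method.
triangle = [
    [100.0, 150.0, 165.0],   # oldest accident year, fully developed
    [110.0, 170.0, None],
    [120.0, None,  None],    # newest accident year
]

n = len(triangle)

# Volume-weighted development factor for each period: total claims at
# period j+1 divided by total at period j, over rows where both exist.
factors = []
for j in range(n - 1):
    num = sum(row[j + 1] for row in triangle if row[j + 1] is not None)
    den = sum(row[j] for row in triangle if row[j + 1] is not None)
    factors.append(num / den)

# Project the open (None) cells forward, column by column, so each
# newly filled cell feeds the next period's projection.
for row in triangle:
    for j in range(n - 1):
        if row[j + 1] is None:
            row[j + 1] = row[j] * factors[j]

print(factors)           # development factors per period
print(triangle[-1][-1])  # ultimate loss estimate for the newest year
```

The real library wraps this in a Triangle data structure with lots of estimator options, but the reserve math at the bottom is this simple.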
Matlab is ugly because it’s so backwards compatible. And it only is backwards compatible until someone decides to use it to interface with external hardware that you need a specific version of some library for.
If you really don’t want to spend money, there’s always GNU Octave. Sure, it doesn’t have the thousands of matlab toolboxes, but if you’re running code from 40 years ago it shouldn’t need those anyway. I wrote a couple of scripts recently and then rewrote them slightly so that they would be compatible with octave.
science_memes