science_memes

This magazine is from a federated server and may be incomplete. Browse more on the original instance.

Classy, in I am so connected to the earth rn.

Botany memes cause my stamens to dehisce. Keep it up!!

tdawg, in Mossie birbs!!

Crunchy moss

squiblet, in mantra that keeps me going through grad school
@squiblet@kbin.social avatar

Mashed potatoes > toxic solvents imo.

lurch, in Oopsie!

This reminds me of the last episode of “The Continental” of all things. A follow up of this side plot would be nice.

zero_spelled_with_an_ecks, in it's got the juice

You’re leaving out a whole sister

1847953620,

This just got way kinkier. And I’m into it.

Track_Shovel, in it's got the juice
@Track_Shovel@slrpnk.net avatar
1847953620,

the metaphorical corn’s hand

Kid_Thunder, (edited ) in abandonware empires

Alright, I know this is going to get some hate, and I fully support emulation and an overhaul of US copyright and patent law, but justmeremember's supportive post is just bad. This is the same bad practice that many organizations, especially in manufacturing, have problems with. If the ~20 years of raw data is so important, then why is it sitting on hardware decades past end-of-life?

If it is worth the investment, then why not invest in a way to convert the data into something less dependent on EOL software? There are lots of ways, cheap and not, to do this.

But even worse, I bet there's 'raw' data that's only a year old still sitting on those machines. I don't know if the 'lab guy' actually pulls a salary or not, but maybe hire someone to begin actually solving the problem instead of playing an eventually losing game?

In ~20 years, couldn't they have been cutting slivers from the budget to eventually invest in something that would perhaps 'reset the clock'?

At this point I wouldn't be surprised to find a post of them complaining about Excel being too slow and unstable because they've been using it as a database for ~20 years' worth of data.

Hillock,

Because it's often not worth the investment. You would pay a shit ton for a one time conversion of data that is still accessible.

If the software became open source, because the company abandoned it, then that cost could potentially be brought down significantly.

You are also missing the parts where functional hardware loses support. Which is even worse in my opinion.

Kid_Thunder, (edited )

Because it's often not worth the investment. You would pay a shit ton for a one time conversion of data that is still accessible.

Still accessible for now, and less likely to be accessible as the clock ticks, and less likely that there will be compatible hardware to replace what fails.

If it isn't worth the investment, then what's the problem here? So what if the data is lost? It obviously isn't worth it.

If the software became open source, because the company abandoned it, then that cost could potentially be brought down significantly.

OK, but that isn't a counterpoint to what I said. If the hardware never fails, there is no problem either. What does that matter? And who cares if it was FOSS (though I am a FOSS advocate)? What if nobody maintains it?

It doesn't matter, because these aren't the reality of the problems this person is dealing with. Why not make some FOSS that takes care of the issue and runs on something that isn't on borrowed time and can endure not only hardware changes but operating system changes? That'd be relevant. It goes back to my point, doesn't it? Why not hire this person?

Clean-room reverse engineering has case-law precedent that makes this low-risk legally (certainly nil risk if the rights holder is defunct).

You are also missing the parts where functional hardware loses support. Which is even worse in my opinion.

I didn't miss the point. I even made the point of having at least ~20 years to plan for it in the budget. Also, the hardware has already lost support or there wouldn't be an issue, would there? Otherwise you could just keep sustaining it without relying on a diminishing supply.

Or are we talking about some hypothetical hardware that wasn't mentioned? I guess I would have missed that point since it was never made.

forrgott,

Ah. So…blame the victim. Cause apparently capitalism is, like, perfect or something.

The company selling the software arbitrarily created a problem for no reason other than greed. And yet, the ones not forking over more money are the problem.

Yeah, hard no from me on your entire argument, buddy.

Kid_Thunder, (edited )

I didn't say capitalism is perfect nor did I imply it.

So hypothetically let's say the vendor lost the rights to the software since it is abandonware -- great. I'd love it.

What changes for justmeremember's situation? Nothing changes.

I suppose your only issue here is that the software vendor or some entity should support it forever. OK, so why didn't they just choose a FOSS alternative or make one themselves? If not then, why not now? There is nothing that stops them from the latter other than time and effort. Even better, everyone else could benefit!

Does that make justmeremember just as culpable here or are they still the victim with no reasonable way to a solution?

I posted simply because this specific issue is much too common, and just as common is the failure to actually solve it, regardless of the abandonware argument, instead of stop-gapping and kicking it down the line until access to the data is gone forever.

grue,

I suppose your only issue here is that the software vendor or some entity should support it forever.

If no entity wants to take on support, they should be forced to release the source code to the Public Domain. Copyright is a social contract, not an entitlement – if you don’t hold up your end of the bargain of keeping it available, you deserve to lose it.

Kid_Thunder, (edited )

Well, I think a better solution would be to deliver all source code with the compiled software as well. I suppose that would extend to the operating system itself and the hope that there'd be enough motivation for skillful folks to maintain that OS and support for new hardware. Great, that would indeed solve the problem and is a potential outcome if digital rights are overhauled. This is something I fully support.

What is stopping them now from solving access to this data, even if it's in a proprietary format?

Really, again, I don't take issue with the abandonware argument but rather with the situation that I posted itself. Source code availability and the rights surrounding are only one part of the larger problem in the post.

Source code and the rights to it aren't the root cause of the problem in the post I was addressing. It could facilitate a solution, sure, but given that there is at least ~20 years of data at risk currently, there were also ~20 years of potential labor hours to solve it. Yet, instead, they chose to 'solve' it in a terrible way. That is what I take issue with.

OhNoMoreLemmy,

This is really not a problem that’s fixed by open source.

The microscope will be controlled by a card that only plugs into 30 year old desktops. If you open source the drivers for it this only gives you the source code to drivers for Windows 95. These drivers will be incredibly hacky and hard coded and probably die if you install a service pack.

Having access to the source code doesn’t let you replace the entire stack, because you’re still physically tied to old hardware that is worse than a Raspberry Pi, and even just making sure that you can update Windows is a feat of engineering.

grue,

At the very least, being able to read the source code gives you a Hell of a head start on writing a new driver for an appropriate OS (and by that I mean Linux, obviously). Saves a whole reverse-engineering step.

Also, the “a card that only plugs into 30 year old desktops” thing isn’t quite as insurmountable as you think.

I’m not saying creating an entire project to adapt the controller and software stack to modern systems would be cheap or easy, but it’s possible – and more to the point, seemingly less expensive than buying the new microscope for “hundreds of thousands of €” (especially in the long run, since the company is likely to pull the same shit over and over again), even if you’ve got to pay a gaggle of comp-e grad students to put it together for you.

OhNoMoreLemmy,

I mean the most upvoted answer in your link says it often is that insurmountable.

Basically, it’s a huge gamble and a substantial software engineering effort even when you know what you’re doing and source code is available.

It’s not surprising that biologists keep using old machines until they die.

flerp,

Because they’re a science research lab not a computer programming lab? Maybe I’m misunderstanding what you’re saying but they’re not the right people, nor in the right situation to be solving this problem.

Kid_Thunder, (edited )

It isn't necessarily a computer programming problem either. Rather, it is an IT problem, at least in part, one that the poster states is the primary job of his 'lab guy': to maintain two ancient Windows 95 computers specifically. That person must know enough to sustain the troubleshooting and replacement of the hardware, and certainly at least the transfer of data from the old spinning hard drives. Why not instead put that technical expertise into actually solving the problem long-term? Why not just run both in QEMU and use hardware passthrough if required? At least then you would rid yourself of the ticking time bomb of aging hardware and its diminishing availability. That RAM that is no longer made isn't going to last forever. They don't even need to know much about how it all works. There are guides available, even for Windows 95.
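The virtualization route suggested above can be sketched in a few shell commands. This is a hypothetical sketch, not a tested migration procedure: the image size, file names, and QEMU flags are assumptions, and a real drive would be imaged from a device node (e.g. via a USB-to-IDE adapter) rather than the zero-filled stand-in used here.

```shell
# 1. Image the old drive to a file. A real run would use something like
#    if=/dev/sdX; here we simulate the source with zeros so this is runnable.
dd if=/dev/zero of=old_drive.img bs=1M count=8 status=none

# 2. Record and verify a checksum before retiring the original hardware.
sha256sum old_drive.img > old_drive.img.sha256
sha256sum -c old_drive.img.sha256

# 3. Boot the image under QEMU; qemu-system-i386 covers Win95-era guests.
#    Illustrative only (commented out): needs QEMU installed and a real image.
# qemu-system-i386 -m 128 -hda old_drive.img -vga cirrus

echo "image ready: $(stat -c%s old_drive.img) bytes"
```

Once the image boots in a VM, the hardware side stops being a single point of failure: the image file can be backed up and moved to any future host.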

Perhaps there are other hurdles, such as running something on ISA, but even so, eventually it isn't going to matter. Primarily, it seems the hurdle is specifically the software and the data it facilitates. Does it really have some sort of ancient hardware dependency? Maybe. But in all that time, this 'lab guy', whose main role is just these two machines, must have had some time to experiment and figure this out. The data must be copyable, even as a straight hard-drive image if it isn't a flat file (extremely doubtful, but it doesn't matter). I mean, the data is, by the author's own emphasis, CRITICAL.

If it is CRITICAL then why don't they give it that priority, even to the lone 'lab guy' that's acting IT?

Unless there's some big edge case that just isn't being said, and there is something above and beyond just the software they speak about, I feel like I've put more effort into typing these responses than it would take to effectively solve the hardware-on-life-support side of it. Solving the software dependency side? Depending on how the datasets are logically stored, it may require a software developer, but it also may not. However, simply virtualizing the environment would solve many, if not all, of these problems with minimal investment, especially for CRITICAL (their emphasis) data with ~20 years to figure it out. It would simply be a new computer, some sort of media to install Linux or *BSD on, and perhaps a COTS converter if it is using something like an LPT interface or even a DB9/DE-9 D-sub (though you can still find modern motherboards, cards, or even laptops supporting those, and certainly a cheap USB adapter as well).

Anyway, I'm just going to leave it at that. I think I've said a lot on the subject to numerous people and don't have much more to add, other than that this is most likely solvable and, outside of severe edge cases, solvable without expert knowledge considering the timeframe.

Getawombatupya, (edited )

In a GxP environment with bespoke pharmaceutical equipment, you are spending anywhere from 1 to 4,000 collective labour hours and anywhere from 50k to 250k for a control system upgrade, URS/TRS/SDS, code risk assessment and review, and qualification. To give you an idea, on a therapeutic manufacturing plant you’re looking at a handful of two-inch binders for the end-to-end system.

You are also (and more importantly) taking your resources off BAU or revenue-generating improvement work for this project. You have a validated and qualified system, and even if you are spending $10-20k on a $500 like-for-like IPC or control card, the cost benefit of another 5 years is worth it.

If your equipment is a medical device, such as a diagnostic microscope, add another few binders of paperwork and regulator sign-off. There’s a reason the equipment is so expensive.

If you get into the food industry or general manufacturing, the barriers to upgrade are much lower. For your machine shop running floppy disks, the external cost would approach the cost of a new machine, and the existing machine is fine.

As a maintenance professional this is the sort of risk management we conduct on an ongoing basis.

ftbd,

Obviously the company is the bad guy here. But if the research data is so important, the lab should try to solve their problem instead of just praying that the 20 year old machine won’t fail.

EuroNutellaMan, (edited )
@EuroNutellaMan@lemmy.world avatar

I study biotech and am currently doing a traineeship in a university lab that likely operates in a similar way, though we are way less expensive to operate and require a bit less precision and safety than medical stuff (so for them the problems here are exacerbated).

Instruments like the ones we use are super expensive (we’re talking in the order of hundreds of thousands of €), funding is not great, salaries are often laughable, the amount of data is huge, and sometimes keeping it for many years is very important. On top of that, most people here barely understand computers and software beyond what they’ve used, which makes sense; they went to study biotech and environmental stuff, not computer science. There’s an IT team in the university, but honestly they barely renew the security certificates for the login pages for the university wifi, so that’s laughable, and granted they’re likely underpaid, probably a result of low public funding as well. Sure, none of these problems would be too impactful if we had all the funding in the world and people who knew what they were doing, but that is not the case, and that’s why we need regulations.

What you’re suggesting is treating the symptoms but not the disease. Making certain file formats compatible with other programs is not an easy undertaking, and certainly not for people without IT experience. Software for tools this expensive should either be open source from the get-go or immediately open-sourced as soon as it’s abandoned or the company goes bust, because ain’t no way we can afford to just throw out a perfectly functioning and serviceable tool that cost us hundreds of thousands of €s just because a company went bust or decided that “no, you must buy a whole new instrument, we won’t give you the old software anymore” in order to access the data they made incompatible with other stuff. Even with plenty of funding to work around the issue, that shouldn’t be necessary; it’s a waste of time and money just so a greedy company can make a few extra bucks.

Kid_Thunder, (edited )

So again and again and again, I was not arguing against the abandonware issue. I take issue with how the problem is being stop-gapped in this current situation and not in some hypothetical alternate timeline.

Instruments like the ones we use are super expensive

Great. I didn't imply otherwise.

On top of that most people here barely understand computer and software

So the lab guy maintaining Windows 95-era computer hardware barely understands computers. Got it. I suppose this same lab guy won't be able to do anything even if the source code were available, and would still be doing the same job.

What you’re suggesting is treating the symptoms but not the disease. Making certain file formats compatible with other programs is not an easy undertaking and certainly not for people without IT experience.

I didn't say it isn't. I said they've had ~20 years to figure it out. What would source code being available solve for them, then? We could assume other people would come together to maintain it, sure. I've also talked about other solutions in replies. There are even more solutions; I wasn't trying to cover all bases there. The point is that this has been a problem for a couple of decades, so there has been plenty of time to solve it.

Software for tools this expensive should either be open source from the get-go or immediately open-sourced as soon as it’s abandoned or company goes bust

Oh OK, so that makes it less complicated. I thought the assumption here was that, in general, anyone in that lab barely understands a computer or how software works. So, who's going to maintain it? Hopefully others, sure. I actually do talk about this in other replies, and it is something I support; in this case, the solution is to deliver the source with the product. FOSS is fantastic. Why can't that just be done now by these same interested parties? Or are we back to "can't computer" again? Then what good is the source code anyway?

But again, that's a "what if things were different", which isn't what I was discussing. I was discussing this specific, real, and fairly common issue of attempting to maintain EOL/EOSL hardware. It is a losing game, and eventually it just isn't going to work anymore.

Even with plenty of funding to workaround the issue that shouldn’t be necessary, it’s a waste of time and money just so a greedy company can make a few extra bucks.

Alright, the source code is available for this person. Let's just say that. What now?

What can be done right now is fairly straightforward, and there are numerous step-by-step guides. That's to virtualize the environment. There is also the option to use hardware passthrough if there is some unmentioned piece of equipment. This could be done with some old laptop or computer that you probably tossed in the dumpster 10 years ago. The cost is likely just some labor. Perhaps that same lab guy can poke around, or, if they're at a university, their department can reach out to the Computer Science or another IT-related teaching department and ask for volunteers, even undergrads. There are very likely students who would want to take it on just because they want to figure it out, and nothing else.

There may be an edge case where it won't work due to some embedded proprietary hardware, but that's yet another hypothetical issue, namely open-sourcing hardware. That's great. Who's going to make that work on a modern motherboard? The person you've supposed can't do it because they barely understand a computer at all?

In this current reality, with the specific part of the post I am addressing, the current 'solution' of sustaining something ancient with a diminishing supply is definitely not the answer. That is the point I was making. There is a potential ~20 years of labor hours. There is a potential ~20 years of portioned budgets. And let's not forget, according to them, it is "CRITICAL" to their operations. Yet it is maintained by a "lab guy" who may or may not have anything more than a basic understanding of computers, using hardware that's no longer made, hoping to cannibalize, buy second-hand, and scavenge from bins somewhere.

If this "lab guy" isn't up to the task, then why are they entrusted with something so critical, with nothing done about it in approximately two decades? If they are up to the task, then why isn't a solution with longevity and real risk mitigation being taken on? It is a short-sighted mentality to just kick it down the road over and over again, plainly hoping something critical is never lost.

EuroNutellaMan,
@EuroNutellaMan@lemmy.world avatar

who’s going to maintain it?

If it’s open source, someone who knows about software can do it so that we don’t have to. It doesn’t even need to be a guy in the lab, since he could just maintain a GitHub repo and we’d use his thing.

If this “lab guy” isn’t up to the task, then why are they entrusted with something so critical with nothing done about it in approximately two decades?

Cause the instrument is important and replacing it, aside from being a massive waste of a perfectly functioning instrument, costs hundreds of thousands if not millions of € that we can’t spend just because some company decided to be shit and some dude on Lemmy said we shouldn’t use stop-gap measures for a problem that’s completely artificial.

Kid_Thunder,

Cause the instrument is important and replacing it, aside from being a massive waste of a perfectly functioning instrument, costs hundreds of thousands if not millions of € that we can’t spend

Why would you need to replace the instrument? You only need to replace the computers' functions. Why does it need to cost anything more than some other old workstation tossed into an e-waste bin years ago?

some dude on Lemmy said we shouldn’t use stop-gap measures for a problem that’s completely artificial.

As opposed to some dude on Lemmy bemoaning that this just can't be solved without source, even though I've given actual solutions available now and for little to no material cost?

You have admitted that you'd still have to rely on someone else's expertise and motivation in the hopes that they'd solve the problem for the lab; yet, in my opinion, you're discarding the solutions I've presented as if they aren't solutions at all because, in at least one of your points, they too would rely on someone else's expertise and motivation. Even then, as I said, they've had decades to figure it out, and there already exist freely available step-by-step instructions to help them solve the problem, or get almost to the end, assuming there is some proprietary hardware never mentioned.

Anyway, I don't really have anything else to add to the conversation. So you can have the last word, if you wish.

EuroNutellaMan, (edited )
@EuroNutellaMan@lemmy.world avatar

Why would you need to replace the instrument?

Because the company made it so it only works with its specific software. Sure, maybe you could try to find a way to hack other software into it, but that is significantly harder than the stop-gap measures or full replacement. If you mess up, you can end up breaking an extremely expensive tool, and since funding is extremely limited (we're talking bare minimum or even less sometimes), you won't risk it.

As opposed to some dude on Lemmy bemoaning that there just can’t be solved without source even though I’ve given actual solutions available now and for little to no material cost?

Yeah, well, one Lemmy dude actually knows the situation and how things work around a lab, and one doesn't seem to understand. It evidently isn't "little to no cost", or most of us sure as shit wouldn't be dealing with stop-gap measures.

You have admitted that you’d still have to rely on someone else’s expertise and motivation in the hopes that they’d solve the problem for the lab

There would easily be a team of software engineers who would take on maintaining a lot of the abandonware we use in a lab, since there are a lot of folks who still rely on software that companies abandoned, including people who know more about software. The key difference you don't understand is that if the source were open, it wouldn't be necessary to have an IT enthusiast in every single lab that needs it; you only need 1 or 2 to maintain a repo.

Even then, as I said, they’ve had decades to figure it out and there exist step-by-step instructions already that are freely available to help them solve the problem or get them almost to the end, assuming, there is some proprietary hardware never mentioned.

First of all, not all abandonware is decades old. Secondly, people are already using the stop-gap solutions you'd find on the internet, like never connecting the computer to the internet and praying nothing breaks, for example.

troyunrau, in It's ok R, we still love you for diagrams.
@troyunrau@lemmy.ca avatar

Somewhere in a backroom, there’s a hamster named Julia. In a hamster ball.

Mbourgon,

And rolling around behind it is a smaller ball called M.

troyunrau,
@troyunrau@lemmy.ca avatar

Don’t forget the centipede crawling around in the sewer pipes named Fortran. We’ve all been trying to kill it for years and yet, somehow, it keeps going.

Mbourgon,

I honestly figured Fortran was still somewhere above M.

troyunrau,
@troyunrau@lemmy.ca avatar

Jokes aside, I encounter Fortran in the applied physics community still fairly often. And have never encountered M in a professional context.

finestnothing,

I loved Julia in my data science classes. Codes like Python, runs like C. You can also use it with bash by piping values in.

baseless_discourse, in I dunno, still might be aliens with this one.

I remember someone mentioning online that the reconstruction of animals is more complicated than just tracing the bone outline.

I'd be very interested if some experts are willing to tell us more.

blackbrook,

They can get some idea from the bones of muscle attachment points and how strong of a muscle would have been attached.

agent_flounder, (edited )
@agent_flounder@lemmy.world avatar

Since no one chimed in (in the past 6 minutes), I, an idiot, will share what I think I know. When reconstructing the face of a person from a skull, either with clay or software, they model the various tissues (muscles, fat, skin, etc.) according to models based on samples. How they would do this for a creature that isn't very like any current living creature, I don't know. It is probably educated guesswork?

I just read an article on this process for a neanderthal and in that particular instance they used data from humans since I guess it was close enough.

But, for example (referencing a recent meme), how do they know Spinosaurus had a sail and not a humped back and neck muscles like a buffalo? Seriously though, I'm sure they can tell which bones have attachment points, how much force they could withstand, etc.

Hillock, (edited )

As another idiot: there is a difference between tusks and teeth. Tusks don't contain enamel, for example, and I think aliens could also determine this difference. It's rare for teeth to stick out like in the reconstruction.

They would also be able to determine that hippos can open their mouth extremely wide. Making it more likely for the long "fangs" to be at least partially covered and not exposed like the tusks of elephants.

lugal,

Often, dinosaurs are depicted with open mouths showing their teeth. This is debated, and more and more scientists think they had closed mouths, like most animals today.

Other than that, the proportion of fat is very hard to reconstruct. Reconstructing a hippo, you would have other mammals in mind; reconstructing dinosaurs, scientists take reptiles as a model, but they could just as well take birds, so this is a big question.

For context: I’m an idiot too

msage,

I’ll be honest, I double-checked your username to make sure I’m not going to read about Undertaker at the end

snooggums,
@snooggums@kbin.social avatar

Over the last few decades there have been massive improvements in telling which bones have attachment points for muscles, and hints at how strong the muscles were likely to be, but it takes a long time to replace all of the existing artwork with newer, more accurate artwork.

Even with improvements to the muscle structure, any part of the body that has fatty buildup, like breasts, would be missed without soft tissue being preserved. I am fairly certain that a hippo's nose and lip area wouldn't leave enough detail to reconstruct accurately. Heck, tyrannosaurs most likely had lips covering their teeth, but that is based on other animals with similar teeth all having lips to protect the teeth from the dryness and rot that doesn't apply to crocodiles, which live in a very wet environment.

diseasedolm, in It's ok R, we still love you for diagrams.

Oh how I wish this was the data scientists I work with

zewu,

This post was sponsored by the Matlab gang

FuglyDuck, in I dunno, still might be aliens with this one.
@FuglyDuck@lemmy.world avatar

They might look cute and cuddly, but hippos are freaking mean. And they hold grudges longer than a snubbed karen-in-law

MonkderZweite,

Aren’t they the most deadly wild animal? (because people think they are like cattle and get too close)

FuglyDuck,
@FuglyDuck@lemmy.world avatar

I’m not sure “most deadly animal” is right, but they’re definitely top five. Mothers are also super protective of their children, and males are hyper-territorial.

kamenlady,
@kamenlady@lemmy.world avatar

They are also faster than they look.

SubArcticTundra,
@SubArcticTundra@lemmy.ml avatar

I don’t get how they can have so much energy to move such a bulky body quickly. Aren’t they also herbivores?

AngryCommieKender,

They are, but they spend most of their time in the water, so they aren’t supporting their own weight.

Every once in a while one of them forgets they aren’t top of the food chain, and attempts to fight an elephant. That goes poorly for the hippo.

transientpunk,
@transientpunk@sh.itjust.works avatar

I mean, the alien reconstruction is like looking into the soul of the hippo

marcos,

Yes, it captures the essence of the animal perfectly.

Klear, (edited )

The Alien Picture of Dorian Hippo.

mindbleach,

Mandalore: “I would believe a hippo has boss phases in real life.”

caesar_salad83, in trig

what’s a tanorange?

otarik, in peas nutz

“Other biologists” except Wallace!

hakase, in Patchyrogan vs. Patchyjones, tonight at 8. PPV Prime Time. Cage Match!!

Because they don’t have six inches of solid bone protecting their brains?

agent_flounder,
@agent_flounder@lemmy.world avatar

Speak for yourself

Nobsi, in 𓍊𓋼😿𓋼𓍊
@Nobsi@feddit.de avatar

Well… Is Fungi a cute widdle baby meow meow boo that I want to hug? Didn’t think so. Fuck you, Fungi.

flying_sheep,
@flying_sheep@lemmy.ml avatar

Not with that attitude

Noodle07,

You’re not very fungi

trashgirlfriend,

We will remember this when our mycelial network grows through your body
