Well, I think a better solution would be to deliver all source code along with the compiled software. I suppose that would extend to the operating system itself, with the hope that there'd be enough motivation for skillful folks to maintain that OS and its support for new hardware. Great, that would indeed solve the problem and is a potential outcome if digital rights are overhauled. This is something I fully support.
What is stopping them now from solving access to this data, even if it's in a proprietary format?
Really, again, I don't take issue with the abandonware argument but rather with the situation in the post itself. Source code availability and the rights surrounding it are only one part of the larger problem in the post.
Source code and the rights to it aren't the root cause of the problem in the post I was responding to. They could facilitate a solution, sure, but given that there are at least ~20 years of data currently at risk, there were also ~20 years of potential labor hours to solve it. Yet, instead, they chose to 'solve' it in a terrible way. That is what I take issue with.
I didn't say capitalism is perfect nor did I imply it.
So hypothetically let's say the vendor lost the rights to the software since it is abandonware -- great. I'd love it.
What changes for justmeremember's situation? Nothing changes.
I suppose your only issue here is that the software vendor or some entity should support it forever. OK, so why didn't they just choose a FOSS alternative or make one themselves? If not then, why not now? There is nothing that stops them from the latter other than time and effort. Even better, everyone else could benefit!
Does that make justmeremember just as culpable here or are they still the victim with no reasonable way to a solution?
I posted simply because this specific issue is much too common, and just as common is the failure to actually solve it, abandonware argument aside, instead of stop-gapping and kicking it down the line until access to the data is gone forever.
Because it's often not worth the investment. You would pay a shit ton for a one-time conversion of data that is still accessible.
Still accessible for now, and less likely to remain accessible as the clock ticks, with compatible replacement hardware also getting harder to find.
If it isn't worth the investment, then what's the problem here? So what if the data is lost? It obviously isn't worth it.
If the software became open source, because the company abandoned it, then that cost could potentially be brought down significantly.
OK, but that isn't a counterpoint to what I said. If the hardware never fails, there is no problem either. What does that matter? And who cares if it was FOSS (though I am a FOSS advocate)? What if nobody maintains it?
It doesn't matter because these aren't the realities of the problem this person is dealing with. Why not make some FOSS that takes care of the issue, runs on something that isn't on borrowed time, and can endure not only hardware changes but operating system changes? That'd be relevant. It goes back to my point, doesn't it? Why not hire this person?
Clean-room reverse engineering has case law precedent that essentially makes it low-risk legally (certainly nil risk if the rights holder is defunct).
You are also missing the part where functional hardware loses support, which is even worse in my opinion.
I didn't miss the point. I even made the point that they had at least ~20 years to plan for it in the budget. Also, the hardware has already lost support, or there wouldn't be an issue, would there? Otherwise you could just keep sustaining it without relying on a diminishing supply.
Or are we talking about some hypothetical hardware that wasn't mentioned? I guess I would have missed that point since it was never made.
Alright, I know this is going to get some hate, and I fully support emulation and an overhaul of US copyright and patent law, but the post supporting justmeremember is just bad. This is the same bad practice that many organizations, especially in manufacturing, have problems with. If the ~20 years of raw data is so important, then why is it sitting on hardware that is decades past end-of-life?
If it is worth the investment, then why not invest in a way to convert the data into something less dependent on EOL software? There are lots of ways, cheap and not, to do this.
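To be concrete about the cheap end of that spectrum: a one-off migration script is often all it takes. This is just a minimal sketch, assuming the legacy exports are tab-delimited text; a real proprietary binary format would need its own parser (or the vendor's export function) in place of the `split` call.

```python
# Hypothetical one-time migration: take a legacy instrument export
# (assumed here to be tab-delimited text) and re-save it as plain CSV,
# a format that doesn't depend on any single vendor's EOL software.
import csv
import io

def convert_export(legacy_text: str) -> str:
    """Convert one tab-delimited legacy export into CSV text."""
    out = io.StringIO()
    writer = csv.writer(out)
    for line in legacy_text.splitlines():
        if not line.strip():
            continue  # skip blank lines in the old export
        writer.writerow(line.split("\t"))
    return out.getvalue()

sample = "timestamp\treading\n2004-06-01T12:00\t3.14\n"
print(convert_export(sample))
```

Wrap that in a loop over the data directory and you've 'reset the clock' once, instead of babysitting dying hardware forever.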
But even worse, I bet there's 'raw' data that's only a year old still sitting on those machines. I don't know if the 'lab guy' actually pulls a salary or not, but maybe hire someone to start actually solving the problem instead of playing an eventually losing game?
In ~20 years, couldn't they have been cutting slivers from the budget to eventually invest in something that would perhaps 'reset the clock'?
At this point I wouldn't be surprised to find a post of them complaining about Excel being too slow and unstable because they've been using it as a database for ~20 years' worth of data.