Comment by dleeftink 16 hours ago
I agree, but want to add that while we may perceive other creative works as 'finished' (to an extent), code often is not. It, unfortunately, needs perpetual work.
It's the same sort of thing, parts obsolescence. The world around you changes and the interfaces you used to interact with the world may not exist anymore. Your dependencies may have been abandoned and have gone through their own bitrot.
I think the best defence is to choose a platform that has longevity, like x86 Linux, and then seriously limit dependencies beyond the platform to the point of maybe not having any extra dependencies.
The problem is that eventually platforms change too. The longest-lasting software platform ever created is x86 + BIOS, which lasted from 1981 until 2020, when Intel officially deprecated BIOS.
The biggest factor is changes in dependencies, so a good defense against bitrot is to reduce dependencies as much as possible and to limit the ones you keep to those that are exceptionally stable.
This greatly limits velocity, though, and still doesn't help against security issues that need patching, or against stable dependencies that made assumptions about hardware that has since changed. But with the right selection of dependencies and some attention to good design, it is possible to write code that is durable against bitrot. It's just very uncommon.
Think you need to go with a "dead" language with a simple runtime and keep everything vendored. My top contender would be Lua 5.1: a simple runtime (~20k(?) lines of C) that has been reimplemented on many other platforms (JavaScript, Go, Rust). The side benefit of 5.1 is that while you can make the standard compiler your target, you can probably run on LuaJIT (not dead, sophisticated assembly, and potential for breaking changes) as well.
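As a rough illustration, here is a minimal sketch of what that looks like in practice: a C program that embeds a vendored copy of the Lua 5.1 interpreter. The vendor/ layout and build details are assumptions, not a prescription; the point is that the project carries its own runtime instead of depending on whatever the system ships.

    /* Minimal sketch: embedding a vendored Lua 5.1 interpreter from C.
     * Assumes the Lua 5.1 sources live in a local vendor/ directory and are
     * compiled into the program, so there is no system-wide Lua dependency. */
    #include <stdio.h>

    #include "lua.h"
    #include "lualib.h"
    #include "lauxlib.h"

    int main(void) {
        lua_State *L = luaL_newstate();  /* fresh interpreter state */
        luaL_openlibs(L);                /* load the standard libraries */

        /* run a script; in a real project this would be a vendored .lua file */
        if (luaL_dostring(L, "print('hello from ' .. _VERSION)") != 0) {
            fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));
            lua_close(L);
            return 1;
        }

        lua_close(L);
        return 0;
    }

Since LuaJIT implements the Lua 5.1 C API, the same embedding code should, in principle, also build against it.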
Try to run it on a 20-year-old system.
If it ran 20 years ago and it still runs now, it's very likely to still run in another 20 years.
Great question. It really depends. 10 years isn’t very long, so most well known languages & platforms will be fine. But 20 or 30 or 50 or 100 years, that gets more interesting.
I’ve kept all my dumb little side projects for my entire life, starting from Basic, Pascal & x86 assembly as a teenager 30 or more years ago, lots of C++ and OpenGL in college, python over the last 15 years, and HTML+Javascript mostly from ~10 years ago.
Surprisingly, the stuff that still runs with the least trouble from my teenage years several decades ago is the assembly code. Basic and Pascal I can do with emulators, but it takes more work. My C++ OpenGL projects from 15-25 years ago take some work to resurrect, but can be done. C++ command line code from 25 years ago compiles and runs without issues. Python from 15 years ago still runs, even the python 2.x code. HTML+JS from 10 years ago still runs without changes. My Arduino projects from 10 years ago might have bit rotted the most; they almost require starting over to get them going again.
Ironically even though the JS ecosystem has had some of the highest churn, I feel like it’s one of the safer bets, as long as you keep dependencies down. Don’t pull a ton of crap from npm/yarn/whatever. Use mostly vanilla JS+HTML, and it will definitely run on almost any OS and mobile device 10 years from now.
Anything with standards behind it necessarily moves pretty slowly. What C++ looks like is changing over time, but old code is pretty safe most of the time, and code written today should continue to work for 10 years easily.
One big thing is just losing knowledge of why things were done a certain way and how they actually work.
Documentation helps and keeping code simple helps.
But what really rots away is human memory.
There is also the coupling issue: when your code depends on another part of your own code, it may be broken by this inner dependency. If the code is not integration-tested enough, then rarely used features may break without you noticing, hence the "rotting" expression. Modern standards help protect against this with the test pyramid.
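A small sketch of that idea, using entirely hypothetical functions: an integration-style test that deliberately exercises the rarely used path through two coupled pieces of code, so a change in one cannot silently break the other.

    /* Sketch (hypothetical code): an integration-style test covering a rarely
     * used path through two coupled functions, so a change in one does not
     * silently break the other. */
    #include <assert.h>
    #include <stdio.h>
    #include <string.h>

    /* hypothetical low-level helper */
    static const char *separator(int windows_style) {
        return windows_style ? "\\" : "/";
    }

    /* hypothetical higher-level function that depends on separator() */
    static void join_path(char *out, size_t n, const char *dir,
                          const char *file, int windows_style) {
        snprintf(out, n, "%s%s%s", dir, separator(windows_style), file);
    }

    int main(void) {
        char buf[128];

        /* the common path, exercised all the time */
        join_path(buf, sizeof buf, "etc", "app.conf", 0);
        assert(strcmp(buf, "etc/app.conf") == 0);

        /* the rarely used path: without this test, a change to separator()
         * could break it without anyone noticing */
        join_path(buf, sizeof buf, "etc", "app.conf", 1);
        assert(strcmp(buf, "etc\\app.conf") == 0);

        puts("all path-joining checks passed");
        return 0;
    }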
ABI compatibility is one of several components involved. The OS the software runs on plays a small role in this problem.
There is a relevant point about OSes though, and it has a different conclusion from yours: write our software (and OSes) in a way that doesn't create barriers and friction between systems.
It has been my understanding that video games do not patch libraries. Pick a version that is available today and use it forever.
And more common nowadays, to re-release/master the recording as 'the artist intended'. But once you are familiarised with an original work and its (unintended) artefacts, a re-do is likely to lose some of the initial magic that drew you to the work in the first place.
It's pretty wild to me (I do hardware) that data goods like code can rot the way they do. If my electronics designs sit for a couple years, they'll need changes to deal with parts obsolescence, etc., if you want to make new units.
If you did want your software project to run the same as today when compiled/interpreted 10 years from now, what would you have to reach for to make it 'rot-resistant'?