RedShift1 2 days ago

I hate compiling. 9 times out of 10 something goes wrong. Sometimes I can fix it; other times I just abandon the effort and use something else. These days I just use packages or Docker images, and if that doesn't work out I'm moving on, ain't nobody got time for this. You really can't expect people who just want to use their computers, and don't even know what a compiler is, to get involved in a process like that.

ses1984 2 days ago

No, end users need not get involved; it could (and should) be handled by the operating system.

  • HKH2 2 days ago

    Gentoo? Compiling big packages takes ages.

    • adrian_b 2 days ago

      More than 20 years ago, with a single-core Pentium 4, it could indeed take something like 3 days of continuous compilation to compile an entire Gentoo distribution, in order to have a personal computer with every application that one might want.

      However, already after the appearance of the first dual-core AMD Athlon 64, 20 years ago, that time could be reduced to not much more than half a day, while nowadays, with a decent desktop CPU from 5 years ago, most Gentoo packages can be compiled and installed in less than a minute.

      There are only a few packages whose compilation and installation can take noticeable time, up to tens of minutes, depending on the chosen options and the number of CPU cores, e.g. Firefox, LibreOffice, LLVM.

      There is only a single package whose compilation may take ages unless you have an expensive CPU and enough memory per core: Google Chromium (including its derivatives that use the same code base).

    • seba_dos1 2 days ago

      I can effortlessly compile Debian packages on my phone.

      With some limits, of course: I can't compile Chromium even on my laptop. But most stuff I can.

    • James_K a day ago

      Not Gentoo. The system I'm suggesting is one where you replace the linker with one that accepts a platform-agnostic assembly code with semantics similar to C, plus perhaps some additional tools for stack traversal to allow the implementation of garbage-collected languages and exceptions. Compiling would be fast because the code you distribute would already have had optimisations applied to it in a pre-compilation step. It would also be automatic, in the same way the present linker system is automatic.
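
      A rough sketch of how that could look with today's tools, taking LLVM bitcode as a stand-in for the platform-agnostic assembly and clang as the install-time "linker" (both stand-ins are assumptions, not part of the proposal):

          # Sketch only: the vendor optimises once and ships bitcode;
          # the user's machine does just the fast final codegen and link.
          import subprocess

          # vendor side: pre-compilation step with optimisations applied
          subprocess.run(["clang", "-O2", "-flto", "-c", "app.c",
                          "-o", "app.bc"], check=True)

          # user side: pre-optimised bitcode to native binary; cheap,
          # since the expensive optimisation passes have already run
          subprocess.run(["clang", "app.bc", "-o", "app"], check=True)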

James_K 2 days ago

The idea is that software would only be shipped if it actually works, and part of this would be ensuring it compiles. I'm not suggesting we use Makefiles or some other foul approach like that. Code should ideally be distributed as a single zip file with an executable header that triggers it to be unpacked into a cache directory by a program analogous to the linker, then compiled by the simple process of compiling all source files.

This could be sped up drastically by having the compiler be a single executable with uniform flags across files. Header files could be parsed once and shared between build threads in parallel, which alone would save much work on the part of the compiler. If that still proves insufficient, a language with superior semantics to C could be chosen which compiles faster. Most code shipped these days is JS, which is compiled on the user's computer; this would be an order of magnitude improvement on that from a performance perspective.
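
For what it's worth, the zip-with-an-executable-header layout is already compatible with standard tooling: a zip archive's central directory sits at the end of the file, so readers still find it after a stub is prepended. A minimal sketch of the unpack-and-compile step, assuming C sources, cc as a stand-in compiler, and a made-up cache path:

    # Sketch: unpack an archive that may carry a prepended executable
    # stub, then compile every source file with uniform flags.
    import os
    import subprocess
    import zipfile

    def unpack_and_build(archive, cache="/tmp/appcache"):
        # zipfile locates the end-of-central-directory record by scanning
        # backwards, so a prepended executable header does not break it
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(cache)
        sources = [os.path.join(cache, f)
                   for f in os.listdir(cache) if f.endswith(".c")]
        # one compiler process, same flags for every file
        subprocess.run(["cc", "-O2", *sources,
                        "-o", os.path.join(cache, "app")], check=True)

    unpack_and_build("app")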