sirwhinesalot a day ago

Apologies for the wall of text below, but it's needed to explain what I mean.

It's not that cuts themselves are the problem; they're a symptom of a larger one, of which unrestricted IO is another. Both exist because a Prolog implementation is intimately tied to a specific search strategy, one that you, as a programmer, must learn and exploit.

And because you can use cuts and IO, the search strategy must remain what it is. It's an ouroboros. A snake eating its own tail. This enforced search strategy makes Prolog a poor declarative language (despite all the neat parlor tricks you can do with it).
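
To make that concrete, here's the textbook max/3 (my own illustration, nobody's production code):

    % Idiomatic Prolog: the cut commits to the first clause once
    % X >= Y succeeds, so the second clause acts as an "else".
    max(X, Y, X) :- X >= Y, !.
    max(_, Y, Y).

    % This only means "maximum" under top-to-bottom, left-to-right
    % DFS, and only for the intended call pattern: the query
    % ?- max(3, 2, 2). wrongly succeeds through the second clause.
    % Change the search strategy (or the clause order) and the
    % program changes meaning.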

Why? As we climb the abstraction ladder, we relinquish control. This is a double-edged sword. We can express problems orders of magnitude more succinctly, but we must trust the machine to arrive at the solution efficiently and with limited guidance.

For example, C might be considered low level in comparison to most languages, but it is extremely high level compared to assembly.

C compilers can generate efficient machine code precisely because certain details are abstracted away: register allocation, function inlining, loop unrolling, constant-folding, dead-code elimination, etc.

These are only possible thanks to various higher-level concepts being available (variables, constants, functions, loops, conditionals, etc.) and lower-level concepts being unavailable (e.g., unrestricted goto). You have inline assembly, but it's restricted to function bodies, and you have to tell the compiler which registers you touch; otherwise you'd prevent it from applying many optimizations entirely.

Many of the supposedly higher-level languages (like Java, C#, etc.) aren't really higher-level than C. You can express the same things in C, just slightly more verbosely. Garbage collection? You can bolt a garbage collector onto C. Generators (yield)? I can implement those with macros. Generics? If void* is too cumbersome, a template file and some include tricks are a few lines of code away. They're just a pile of admittedly nice conveniences.

A pure functional language, though? That could allow us to go a level higher. It's not the pattern matching or the higher-order functions, however; those are still things C can express. No, the key is referential transparency and the lack of sequencing. Referential transparency means order of execution no longer matters. And if order of execution no longer matters, then we can parallelize everything.

We don't have a good strategy for automating this in a compiler yet, but the capability is there. Large-scale MapReduce is only possible because we've abstracted loops away into those higher-level operations (map and reduce) calling referentially transparent functions.

Prolog didn't relinquish enough control. We can't express certain problems without resorting to weird dances to appease the poor search algorithm, same as how I might have to contort the way I write a C program so the compiler can vectorize the code.
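
A minimal example of such a dance, using the standard transitive-closure program (my own sketch):

    % Declaratively, these two definitions of reachability mean the
    % same thing, but under Prolog's depth-first, left-to-right
    % execution the first loops forever: it recurses on path/2
    % before consuming a single edge.
    path(X, Y) :- path(X, Z), edge(Z, Y).   % loops under DFS
    path(X, Y) :- edge(X, Y).

    % The contortion the search algorithm demands:
    path2(X, Y) :- edge(X, Y).
    path2(X, Y) :- edge(X, Z), path2(Z, Y).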

If we relinquish enough control to arrive at Datalog (not even Turing-complete), we can make use of all the tools available to the relational model: the sort of dynamic query planning and optimization database engines use. A human can write a better query plan than a database engine, but a human cannot write a tailored query plan every few microseconds.

In Datalog I don't have to worry much about how I write my rules. I can have them be mutually recursive, I can have multiple rules with the same pattern in the head, I can write the body in a really inefficient order. A Datalog engine is free to optimize all of that out. I can't use higher-order predicates, but I can in ASP, which also has massively more efficient search than Prolog.
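
For instance, the left-recursive reachability program from above is unremarkable Datalog; a sketch, not tied to any particular engine:

    % Datalog (syntactically a subset of Prolog): rule order, body
    % order, and left recursion are all irrelevant here. Bottom-up,
    % semi-naive evaluation computes path/2 and terminates, because
    % there are finitely many facts and no function symbols.
    path(X, Y) :- path(X, Z), edge(Z, Y).
    path(X, Y) :- edge(X, Y).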

Even in other Turing-complete logic programming languages, like Mercury and Curry, I have a lot more freedom in how I express the problem, because they have the freedom to optimize the code in a way that allows the solution to be arrived at efficiently. Curry can guarantee that a solution will be found if one exists; Prolog cannot.

The extra-logical control Prolog provides (like cuts) only serves as a tool to work around the poor search algorithm it uses, one it cannot change because of the control it gives. To me it's just an evolutionary dead-end, albeit one of great historical importance.

  • YeGoblynQueenne 21 hours ago

    The wall of text is fine, but I am not convinced. If you really have a real-life, practical problem with cuts, then your comment fails to articulate it, because I didn't understand what it was meant to be. If it is a matter of aesthetics - higher-order languages look prettier and we could, in theory (but not in practice), parallelise them - then I don't care about that. Aesthetics has no place in pragmatic considerations like the amount of resources you actually need to be able to use a language. Case in point:

    >> We don't have a good strategy for automating this in a compiler yet, but the capability is there

    And it's been "there" since the 1960s, but it's still not here. Looks pretty on paper, doesn't work on metal. Close, but no cigar.

    >> They both exist because a Prolog implementation is intimately tied to a specific search strategy.

    That is not right. There are different execution strategies for Prolog other than DFS. In fact, Datalog's bottom-up evaluation with a T_P operator is one of those: you can apply it to Prolog programs just fine, except without the nice termination guarantees; but then, Church-Turing thesis.
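
    For the sceptical: a naive version of that bottom-up evaluation fits in a few lines of Prolog itself. The rule/2 representation below is made up for the example, not any engine's actual interface:

        % Immediate-consequence operator T_P: everything derivable
        % from interpretation I in one step.
        rule(edge(a, b), []).
        rule(edge(b, c), []).
        rule(path(X, Y), [edge(X, Y)]).
        rule(path(X, Y), [edge(X, Z), path(Z, Y)]).

        tp(I, TpI) :-
            findall(H, (rule(H, B), satisfied(B, I)), Hs),
            sort(Hs, TpI).

        satisfied([], _).
        satisfied([A|As], I) :- member(A, I), satisfied(As, I).

        % Iterate T_P from the empty set up to the least fixpoint,
        % i.e. the minimal model. With function symbols this loop
        % may never reach a fixpoint - hence no termination
        % guarantees.
        lfp(M) :- lfp([], M).
        lfp(I, M) :-
            tp(I, TpI),
            (   TpI == I -> M = I
            ;   lfp(TpI, M)
            ).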

    Another execution strategy is SLG-Resolution with tabulation, a.k.a. tabling, which I mentioned in my other comment. I copy the link to SWI-Prolog's implementation here again:

    https://www.swi-prolog.org/pldoc/man?section=tabling
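
    The canonical demo, which runs as-is in SWI-Prolog: the left-recursive program that loops under plain DFS terminates once tabled.

        :- table path/2.

        % Left recursion is fine here: answers to path/2 are memoised
        % and reused, so ?- path(a, Y). returns Y = b and Y = c and
        % then terminates, instead of looping.
        path(X, Y) :- path(X, Z), edge(Z, Y).
        path(X, Y) :- edge(X, Y).

        edge(a, b).
        edge(b, c).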

    Tabling is as old as nails in the Prolog community [1]. Besides which, there is a ton of computer science knowledge you can bring to bear on the problem of avoiding non-termination, or unnecessary backtracking. If that is, indeed, the problem. Like I say, I don't understand from your comment what your real-life, real-world, practical problem with using cuts is.

    >> Prolog didn't relinquish enough control. We can't express certain problems without resorting to weird dances to appease the poor search algorithm, same as how I might have to contort the way I write a C program so the compiler can vectorize the code.

    I don't understand what that means. Prolog is FOL running on a digital computer. One is a computational paradigm, the other is a computational device. There is a mismatch between the two, unfortunately and inevitably, because of the difference between theory and practice etc etc, and there are concessions that had to be made to be able to run one on top of the other. Messrs Kowalski and Colmerauer (the man whose name not even French computer scientists can pronounce; true story) alighted on a set of choices that had the advantage of being the most practical possible, for the purpose of having a real language, that one can write real code in, to run on real computers; unlike, say, Haskell.

    And you can do a lot more than just "write code". Prolog, because its interpreter is an implementation of SLD-Resolution, is not just a programming language but a sound and (refutation-) complete deductive inference system for FOL. And with recent advances in Inductive Logic Programming, it turns out it's a sound and refutation-complete inductive inference system, too.

    Another couple of points:

    DFS is not a poor choice of a search algorithm. It's an efficient one. You say ASP is "massively more efficient" than Prolog. It's not. ASP interpreters must first solve an NP-complete problem, which they only do thanks to powerful, special-purpose solvers. Those solvers are fast [2], but efficiency is a whole other ball game (we don't have efficient algorithms for NP-complete problems). The NP-complete problem is that, because of ASP's bottom-up interpretation, the entire Herbrand base of a predicate must be fully ground. That's why ASP also has no functions [3]; so, no lists. The big advantage of Resolution, its One Neat Trick as an algorithm, is exactly that it doesn't have to ground the entire Herbrand base before finding a proof, if one exists. That's thanks to unification. More to the point, as Resolution proceeds, it prunes the search space of proofs, just by discarding branches starting at goals that fail unification. ASP can't do that [4,5].
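
    To make the grounding point concrete, the standard textbook example:

        % The Herbrand base of nat/1 is infinite - nat(0), nat(s(0)),
        % nat(s(s(0))), ... - so it can never be fully ground, which
        % is why grounders ban function symbols. Resolution never
        % needs to ground it: the query below succeeds in three
        % derivation steps, and any clause whose head fails to unify
        % with the goal is discarded immediately.
        nat(0).
        nat(s(X)) :- nat(X).

        % ?- nat(s(s(0))).   % true, no grounding required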

    See, every time anyone has tried to "fix" Prolog's not-fully-100%-declarativeness, they've just broken it.

    Or, as I like to say about logic programming in general; sound, complete, efficient: choose two.

    _______________

    [1] OLD-Resolution with Tabulation; Tamaki and Sato, 1986 https://dl.acm.org/doi/10.5555/12069.12075

    [2] And cool.

    [3] Because functions turn a predicate's Herbrand base infinite, and then it takes an infinite amount of time to ground it.

    [4] Well, ish. Standard, bottom-up ASP can't, but there are different ways to evaluate ASP programs, too. For instance, you can evaluate them top-down, with unification, just like in Prolog. See sCASP (first link I could find; dig deeper):

    http://platon.etsii.urjc.es/~jarias/slides/gde21-tutorial.pd...

    [5] It's not just unification pruning the search space of proofs, either. It's that the cardinality of the resolution closure of a set of Horn clauses decreases monotonically until either the empty clause is constructed by resolution, or... well, infinity.

    • sirwhinesalot 7 hours ago

      Sure, I like to think about the more philosophical aspects, but I understand they're not something people necessarily care about. If you prefer just the practical side of things, then that's a lot easier to explain: I didn't like working with Prolog.

      So what follows is purely anecdotal, of course, but it can't really be anything else. At my university we were exposed to all four major paradigms: procedural (C), OOP (Java), functional (Haskell), and logic (Prolog).

      When I first got exposed to Prolog, I was amazed. "Wow, you can just write the rules and it figures out the solution!". That didn't last long. It was slow and clunky, I had to be careful how I wrote the rules, and debugging was an exercise in frustration. Sometimes it ran forever, sometimes it gave repeated answers, sometimes it gave all the right answers and then ran forever.

      Cuts were introduced as the solution to the performance issues, and to me they were witchcraft more than they were engineering: something you threw around at random in the hopes of making it work. I tried to go all in with it anyway, making the whole class project as much as possible in Prolog (it was a Java/Prolog mix).

      My professor actually ended up giving me a bad grade because that's not how he wanted Prolog to be used. He wanted us to use it more like one might use Datalog (with good reason, but Datalog was never explicitly mentioned).

      At the time I didn't get it; the whole paradigm just felt like a complete joke to me. Something researchers liked to play with, but unfit for any real work. That was also the perception of the vast majority of my colleagues.

      The same was true of functional programming, by the way, though to a lesser extent. I (and others) could definitely see the benefits of referential transparency, higher-order functions, and lazy evaluation. But I'd just apply them in whatever language I was already using. Same as how, if I wanted to do search, I'd just write a depth-first search algorithm myself; no reason to bother with Prolog.

      All examples were either trivial nonsense, ridiculous contortions of the paradigm (a window manager? really?), or "irrelevant" research (research about Prolog using Prolog to do more Prolog).

      But the problem is that my perception of Prolog extended to my perception of logic programming in general. I saw them as one and the same. The "best" and "most famous" logic programming language is jank, so the whole paradigm must be jank.

      Only after I was later exposed to SMT solvers did it finally "click" that different programming paradigms can be truly transformative. That they can provide truly massive productivity boosts. Z3 added a Datalog mode at some point, which was when I realized my perception of the logic paradigm was clouded.

      From Z3 Datalog I moved on to Souffle Datalog and Datomic (that syntax though, ugh), where I got to experience how poor SQL truly is. I already knew it was a poor showing of the relational paradigm, but I didn't realize the relational paradigm itself was so much weaker than the real deal: logic programming.

      So that convinced me logic programming had value. Real, undeniable, practical value, not "parlor tricks". But that was still "the non-Turing-complete fragment of logic programming is useful", not "logic programming is useful". Because full logic programming still meant "Prolog" to me, the slow, janky language I used in university.

      But then I got to experience Mercury and Curry: full, Turing-complete logic programming languages with massively better performance than Prolog, that don't need cuts to perform well. Mercury has its determinism modes, which make debugging the behavior of rules so much easier (much as how static typing in general makes everything easier to debug).

      Curry's "needed narrowing" semantics also means if there is a solution, it will find it. This is true even though Curry is also a Turing-complete logic programming language. Prolog can get stuck in a branch of the search tree that cannot lead to a solution.

      Mercury and Curry are both strong evidence to me that "every time anyone has tried to fix Prolog's not-fully-100%-declarativeness, they've just broken it" is not really accurate.

      Can you give me a serious example that would not be expressible in either? Another commenter mentioned meta-interpreters, but while they're incredibly cool, I'd still classify them more as a great research tool than a practical one. How often do you need one?

      • YeGoblynQueenne 5 hours ago

        >> If you prefer just the practical side of things then that's a lot easier to explain: I didn't like working with Prolog.

        Great, but that's aesthetics. You'd save a lot of time by being upfront about that. De gustibus non est disputandum.

        I note that your aversion to Prolog comes from a course you had in undergrad, I think? I fell in love with Prolog in my undergrad course and even undertook a substantial project in it, where I loved it even more - but I didn't really get it until my PhD, where I had the time to really sit down and study, not just Prolog as a programming language, but logic programming as a thing unto itself. I didn't even know what sources to read, and I had read all the Prolog textbooks I could find. That's not to gatekeep, or pull rank, but logic programming is a deep and old subject and one really needs guidance to get to grips with it, which you just don't get in undergrad (not least because many tutors themselves aren't that deeply knowledgeable in the subject).

        I note also that you seem to have been introduced to Prolog in terms of its vaunted declarative qualities: "you just write the rules". That's probably the worst lie one can tell to students and the worst possible way to introduce Prolog, because, as you found out, Prolog just doesn't work that way. For example, back in my undergrad, I only learned about the "declarative paradigm" much later on, and even then I thought it was more of a gimmick, the kind of argument people reach for when they try to convince others to be excited about the same things they're excited about. This kind of argument is supposed to go beyond aesthetics, but it fails to do so. For example, why is declarative programming better than procedural? It isn't. It's just a matter of taste.

        The first thing I paid attention to about Prolog was "automated theorem proving", which was what blew my mind. I had just noticed the way manual theorem proving proceeds in a very mechanical manner and was thinking it should be possible to write a program to do it, when I found out someone had done it, for real, and much better than I ever could. I don't think that's aesthetics. It's a real, and distinct, capability that you don't find in, e.g., C or Haskell. How important it is, that's again a personal matter. For me, plenty. For others, not so much. De gustibus etc.

        The right way to introduce Prolog is the long way around, starting with propositional logic, then FOL, then logic programming, then Resolution, and then only at the end introducing Prolog and other logic programming languages. The way Prolog is taught now, it just doesn't work and everyone leaves university without understanding a thing about it.

        I can't give a "serious example" of using Mercury or Curry because I have only checked them out once or twice and don't have experience with them. I don't have anything against them, to be clear, as I don't have anything against Datalog. I prefer Prolog because it's more mature and its implementations more feature-rich. SWI-Prolog in particular is bedecked with libraries like a Swiss army knife, and its maintainer is a programming super-hero. But if you like Mercury then you could possibly help with its development, which I understand has slowed down for lack of interest - a big shame.

        On meta-interpreters, I use one all the time. The Inductive Logic Programming system that I created during my PhD is based on an inductive, second-order Prolog meta-interpreter:

        https://github.com/stassa/louise

        You'll find the early version of that meta-interpreter in https://github.com/stassa/louise/blob/master/src/vanilla.pl A new version, decoupled from any specific learning system, is in a private repo. Let me know if you're very, very interested. A write-up is here:

        https://hmlr-lab.github.io/pdfs/Second_Order_SLD_ILP2024.pdf
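
        For anyone who hasn't seen one: the first-order "vanilla" meta-interpreter that all of this elaborates on is three clauses of plain Prolog (the textbook version, to be clear, not the second-order one in the repo):

            % solve/1 proves a goal against the program's own clauses,
            % turning the proof procedure into an ordinary, modifiable
            % program - the hook that ILP systems like the above exploit.
            solve(true) :- !.
            solve((A, B)) :- !, solve(A), solve(B).
            solve(Goal) :- clause(Goal, Body), solve(Body).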

        >> Prolog can get stuck in a branch of the search tree that cannot lead to a solution.

        There are more ways to execute Prolog than DFS, so that you don't get stuck in loops. Obviously there'll always be infinite proof branches, because Church-Turing. I'm not sure you fully appreciate that?

        • sirwhinesalot 4 hours ago

          Your comment is very fair; indeed, perhaps the "way it is taught" is a major problem. Personally, I was quite happy to later find out that the "you just write the rules" declarative promise is true, just with caveats, i.e., you have to restrict yourself to a subset of what could otherwise be expressed.

          That might seem like an annoying limitation, but to me it is not. Most problems fit well within the bounds, and you get to reap the benefits.

          >> There's more ways to execute Prolog than DFS so that you don't get stuck in loops. Obviously there'll always be infinite proof branches because Church-Turing. I'm not sure you fully appreciate that?

          It doesn't have to do with Church-Turing. The issue is with DFS specifically, which can get stuck in an unproductive search branch. This is mitigated in Mercury by termination checks and mode and determinism declarations, and entirely solved in Curry by an execution semantics that guarantees fair search (the only way to not terminate is if there is no solution).
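
          A minimal illustration of what I mean (my own toy example):

              % A one-step proof of p exists via the second clause, but
              % standard Prolog DFS commits to the first clause and
              % descends into loop/0 forever (in practice, until the
              % stack overflows). A fair search strategy finds the proof.
              loop :- loop.

              p :- loop.
              p.

              % ?- p.   % diverges under DFS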

          If you can execute Prolog in a different manner then I'm happy that's the case, but I find it hard to believe that it's still Prolog. How would the IO come out in the same order?

          >> But if you like Mercury then you could possibly help with its development, which I understand has slowed down for lack of interest, which is a big shame.

          Sadly I lack the skillset...

          >> I prefer Prolog because it's more mature, and its implementations more feature-rich

          My preference for the semantics of the far less popular logic programming languages likely clouds my judgement of Prolog more than I'd like to admit.