Comment by sirwhinesalot a day ago
Apologies for the wall of text below, but it's needed to explain my position.
It's not that cuts themselves are the problem; they're a symptom of a larger one, of which unrestricted I/O is another. Both exist because a Prolog implementation is intimately tied to a specific search strategy, one that you, as a programmer, must learn and exploit.
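To make that concrete, here's a small illustrative example of my own (the classic "red cut" version of max/3):

    % max/3 is only correct under Prolog's top-to-bottom clause order
    % and left-to-right, depth-first search.
    max(X, Y, X) :- X >= Y, !.
    max(_, Y, Y).

    % Swap the clauses (or run under a different search strategy) and
    % ?- max(3, 2, M). can answer M = 2. Even as written,
    % ?- max(3, 2, 2). wrongly succeeds: the first head fails to
    % unify, so the cut is never reached.

The program only means what you want it to mean if you know, and rely on, exactly how the interpreter will traverse it.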
And because you can use cuts and IO, the search strategy must remain what it is. It's an ouroboros. A snake eating its own tail. This enforced search strategy makes Prolog a poor declarative language (despite all the neat parlor tricks you can do with it).
Why? As we climb the abstraction ladder, we relinquish control. This is a double-edged sword. We can express problems orders of magnitude more succinctly, but we must trust the machine to arrive at the solution efficiently and with limited guidance.
For example, C might be considered low level in comparison to most languages, but it is extremely high level compared to assembly.
C compilers can generate efficient machine code precisely because certain details are abstracted away: register allocation, function inlining, loop unrolling, constant-folding, dead-code elimination, etc.
These are only possible thanks to various higher-level concepts being available (variables, constants, functions, loops, conditionals, etc.) and lower-level concepts being unavailable (e.g., unrestricted goto). You have inline assembly, but it's restricted to function bodies, and you have to tell the compiler which registers you touch, or you'd prevent it from applying many optimizations entirely.
Many of the supposedly higher-level languages (like Java, C#, etc.) aren't really higher-level than C. You can express the same things in C, just slightly more verbosely. Garbage collection? You can bolt a garbage collector onto C. Generators (yield)? I can implement that with macros. Generics? If void* is too cumbersome, a template file and some include tricks are a few lines of code away. They're just a pile of admittedly nice conveniences.
A pure functional language, though? That could allow us to go a level higher. It's not the pattern matching or the higher-order functions, however; those are still things C can express. No, the key is the referential transparency and the lack of sequencing. Referential transparency means order of execution no longer matters. And if order of execution no longer matters, then we can parallelize everything.
We don't have a good strategy for automating this in a compiler yet, but the capability is there. Large-scale MapReduce is only possible because we've abstracted away loops into those higher-level operations (map & reduce) calling referentially transparent functions.
Prolog didn't relinquish enough control. We can't express certain problems without resorting to weird dances to appease the poor search algorithm, just as I might have to contort the way I write a C program so the compiler can vectorize the code.
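Here is one such dance, using the usual toy reachability program (the edge/2 facts are made up):

    edge(a, b).
    edge(b, c).

    % Two logically equivalent definitions of reachability:
    path(X, Y) :- edge(X, Y).
    path(X, Y) :- edge(X, Z), path(Z, Y).    % fine under DFS

    % path(X, Y) :- path(X, Z), edge(Z, Y).  % left recursion: denotes
    %                                        % the same relation, but
    %                                        % exhaustive search under
    %                                        % Prolog's DFS never
    %                                        % terminates on it

Both versions denote the same relation; only one survives contact with the default execution strategy.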
If we relinquish enough control to arrive at Datalog (not even Turing-complete), we can make use of all the tools available to the relational model. The sort of dynamic query planning and optimization database engines use. A human can write a better query plan than a database engine, but a human cannot write a tailored query plan every few microseconds.
In Datalog I don't have to worry much about how I write my rules. I can have them be mutually recursive, I can have multiple rules with the same pattern in the head, I can write the body in a really inefficient order. A Datalog engine is free to optimize all of that out. I can't use higher-order predicates, but I can in ASP, which also has massively more efficient search than Prolog.
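For instance, a bottom-up engine will happily take a made-up program like this, with rules and body goals in any order:

    succ(0, 1). succ(1, 2). succ(2, 3).

    % Mutual recursion, multiple rules with the same head pattern,
    % bodies in whatever order: bottom-up evaluation computes the
    % same least fixpoint regardless:
    % even(0), odd(1), even(2), odd(3).
    even(X) :- odd(Y), succ(Y, X).
    odd(X)  :- even(Y), succ(Y, X).
    even(0).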
Even in other Turing-complete logic programming languages, like Mercury and Curry, I have a lot more freedom in how I express the problem, because they have the freedom to optimize the code in a way that allows the solution to be arrived at efficiently. Curry can guarantee that a solution will be found if one exists; Prolog cannot.
The extra-logical control Prolog provides (like cuts) only serves as a tool to work around the poor search algorithm it uses, one it cannot change because of the control it gives. To me it's just an evolutionary dead-end, albeit one of great historical importance.
The wall of text is fine, but I am not convinced. If you really have a real-life, practical problem with cuts, then your comment fails to articulate it, because I didn't understand what it was meant to be. If it is a matter of aesthetics - higher-order languages look prettier and we could, in theory (but not in practice) parallelise them - then I don't care about that. Aesthetics has no place in pragmatic considerations like the amount of resources you really need to be able to use a language. Case in point:
>> We don't have a good strategy for automating this in a compiler yet, but the capability is there
And it's been "there" since the 1960s, but it's still not here. Looks pretty on paper, doesn't work on metal. Close, but no cigar.
>> They both exist because a Prolog implementation is intimately tied to a specific search strategy.
That is not right. There are different execution strategies for Prolog, other than DFS. In fact, Datalog's bottom-up evaluation with a TP-operator is one of those: you can apply it to Prolog programs just fine, except without the nice termination guarantees; but then, Church-Turing thesis.
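To illustrate, here is a toy sketch of exactly that (the rule/2 terms and plain fact list are my own made-up encoding; a real engine would use semi-naive evaluation and proper indexing):

    % One application of the immediate-consequence operator T_P:
    % derive every rule head whose body goals all hold in Facts.
    tp(Rules, Facts, New) :-
        findall(H,
                ( member(R, Rules),
                  % fresh variables for each use of a rule
                  copy_term(R, rule(H, Body)),
                  all_hold(Body, Facts),
                  % assumes range-restricted (Datalog) rules,
                  % so H is ground by now
                  \+ member(H, Facts)
                ),
                Derived),
        sort(Derived, New).

    all_hold([], _).
    all_hold([G|Gs], Facts) :- member(G, Facts), all_hold(Gs, Facts).

    % Iterate T_P to the least fixpoint.
    lfp(Rules, Facts, Fix) :-
        tp(Rules, Facts, New),
        (   New == [] -> Fix = Facts
        ;   append(Facts, New, Facts1),
            lfp(Rules, Facts1, Fix)
        ).

    % ?- lfp([ rule(path(X,Y), [edge(X,Y)]),
    %          rule(path(X,Y), [path(X,Z), edge(Z,Y)]) ],
    %         [edge(a,b), edge(b,c)], Fix).
    % Fix = [edge(a,b), edge(b,c), path(a,b), path(b,c), path(a,c)].

Run it on a Prolog program with function symbols and, as noted, you lose the termination guarantee, but the evaluation strategy itself applies unchanged.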
Another execution strategy is SLG-Resolution with tabulation, a.k.a. tabling, which I mentioned in my other comment. I copy the link to SWI-Prolog's implementation here again:
https://www.swi-prolog.org/pldoc/man?section=tabling
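And a minimal sketch of the effect, using SWI-Prolog's table/1 directive (the edge/2 facts are made up):

    :- table path/2.

    % The left-recursive formulation that loops under plain DFS
    % terminates under SLG resolution:
    path(X, Y) :- path(X, Z), edge(Z, Y).
    path(X, Y) :- edge(X, Y).

    edge(a, b).
    edge(b, c).

    % ?- path(a, Y).
    % produces Y = b and Y = c, then the query terminates.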
Tabling is as old as nails in the Prolog community [1]. Besides which, there is a ton of computer science knowledge you can bring to bear on the problem of avoiding non-termination, or unnecessary backtracking. If that is, indeed, the problem. Like I say, I don't understand from your comment what your real-life, real-world, practical problem with using cuts actually is.
>> Prolog didn't relinquish enough control. We can't express certain problems without resorting to weird dances to appease the poor search algorithm, same as how I might have to contort the way I write a C program so the compiler can vectorize the code.
I don't understand what that means. Prolog is FOL running on a digital computer. One is a computational paradigm, the other is a computational device. There is a mismatch between the two, unfortunately and inevitably, because the difference between theory and practice etc. etc., and there are concessions that had to be made to be able to run one on top of the other. Messrs Kowalski and Colmerauer (the man whose name not even French computer scientists can pronounce; true story) alighted on a set of choices that had the advantage of being the most practical possible, for the purpose of having a real language that one can write real code in, to run on real computers; unlike, say, Haskell.
And you can do a lot more than just "write code". Prolog, because its interpreter is an implementation of SLD-Resolution, is not just a programming language but a sound and (refutation-)complete deductive inference system for FOL. And with recent advances in Inductive Logic Programming, it turns out it's a sound and refutation-complete inductive inference system, too.
Another couple of points:
DFS is not a poor choice of a search algorithm. It's an efficient one. You say ASP is "massively more efficient" than Prolog. It's not. ASP interpreters must first solve an NP-complete problem, which they only manage thanks to powerful, special-purpose solvers. Those solvers are fast [2], but efficiency is a whole other ball game (we don't have efficient algorithms for NP-complete problems). The trouble is that, because of ASP's bottom-up interpretation, the entire Herbrand base of a predicate must be fully ground before solving. That's why ASP also has no functions [3]; so, no lists. The big advantage of Resolution, its One Neat Trick as an algorithm, is exactly that it doesn't have to ground the entire Herbrand base before finding a proof, if one exists. That's thanks to unification. More to the point, as Resolution proceeds, it prunes the search space of proofs, just by discarding branches starting at goals that fail unification. ASP can't do that [4,5].
See, every time anyone has tried to "fix" Prolog's not-fully-100%-declarativeness, they've just broken it.
Or, as I like to say about logic programming in general; sound, complete, efficient: choose two.
_______________
[1] OLD Resolution with Tabulation; Tamaki and Sato, 1986. https://dl.acm.org/doi/10.5555/12069.12075
[2] And cool.
[3] Because functions turn a predicate's Herbrand base infinite, and then it takes an infinite amount of time to ground it.
[4] Well, ish. Standard, bottom-up ASP can't but there are different ways to evaluate ASP ... programs, too. For instance, you can evaluate them top-down, with unification, just like in Prolog. See sCASP (first link I could find; dig deeper):
http://platon.etsii.urjc.es/~jarias/slides/gde21-tutorial.pd...
[5] It's not just unification pruning the search space of proofs, either. It's that the cardinality of the resolution closure of a set of Horn clauses decreases monotonically until either the empty clause is constructed by resolution, or... well, infinity.