Comment by CyberDildonics 2 days ago
Integer linear programming doesn't sound very complex.
This is actually a pretty poor example, because we can solve huge TSP instances to optimality in practice (see the Concorde solver). There are many trickier combinatorial problems, such as packing or covering problems, that are typically much harder to solve to optimality.
The NP-completeness is often not very relevant for practical concerns, as the parent post mentioned.
There are many applications with an underlying, technically NP-complete problem for which optimal solutions are easily found for almost all instances you will encounter. Because the problem is NP-complete it's possible to construct instances which aren't efficiently solvable, but that doesn't mean you'll ever encounter one in practice.
Beyond that, there are plenty of cases where an efficient approximation with bounded error exists, and that bound might be arbitrarily small.
To give an easily understood example, consider a knapsack problem where the items are never larger than 1% of the knapsack size.
There is a trivial O(N log N) approximation algorithm: sort by value per unit of size, then include items in that order until no more fit. This yields an optimal solution when the last selected item exactly fills the sack; otherwise it yields an approximate solution that is suboptimal by no more than the unfilled space, which is at most 1% due to the size constraint. And 1% is the worst-case approximation error: normally less space is left unused, and the lowest value per unit of size is a fraction of the highest, so the actual average error in a real application might be, say, 0.01%. If some other application constraint meant the item sizes always added up exactly to the sack size, then the simple algorithm would always give an optimal solution.
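A minimal sketch of that greedy in Python (the function name and demo data are illustrative, not from the post):

```python
# Greedy knapsack approximation: sort by value density, take items in
# that order until one no longer fits. With every item at most 1% of
# capacity, the unfilled space -- and hence the value shortfall -- is small.

def greedy_knapsack(items, capacity):
    """items: list of (value, size) pairs. Returns (total_value, chosen)."""
    chosen, total_value, used = [], 0, 0
    for value, size in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if used + size > capacity:
            break  # "until no more fit"
        chosen.append((value, size))
        total_value += value
        used += size
    return total_value, chosen

# Tiny demo: capacity 100, every item size 1 (i.e. 1% of the sack),
# values cycling 1..7. The sack fills exactly, so the result is optimal.
value, picked = greedy_knapsack([(i % 7 + 1, 1) for i in range(150)], 100)
```

Since all sizes are equal here, sorting by value density is just sorting by value, and the greedy provably picks the 100 most valuable items.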
Of course, you can do better than the simple algorithm with some search, but even the dumbest algorithm gives very good approximations and is sometimes optimal... and yet the general knapsack problem (even just its decision form) is NP-complete. This can be quite surprising to anyone who assumes NP-complete == unsolvable.
Complexity theory is awesome but it can also lead us astray.
I normally use the simplex method, which is fast in practice though not polynomial in the worst case.
You can always just run a portfolio of simplex/barrier/PDLP and grab whichever returns something first. The latter two are usually slower but run in polynomial time, so you always win.
Can't do that with SAT or ILP.
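The portfolio idea above can be sketched like this (the solver functions are stand-ins, not real LP solver APIs; a real portfolio would wrap actual simplex and interior-point solvers):

```python
# Race several solver strategies concurrently and return whichever
# finishes first; slow strategies are simply wasted work, not failures.
import concurrent.futures
import time

def simplex_stub(problem):
    time.sleep(0.01)   # pretend to pivot for a while
    return ("simplex", problem * 2)

def barrier_stub(problem):
    time.sleep(0.05)   # polynomial-time in theory, slower on this instance
    return ("barrier", problem * 2)

def solve_portfolio(problem, solvers):
    """Submit every solver; return the first result that comes back."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(s, problem) for s in solvers]
        done, _ = concurrent.futures.wait(
            futures, return_when=concurrent.futures.FIRST_COMPLETED)
        return next(iter(done)).result()

method, answer = solve_portfolio(21, [simplex_stub, barrier_stub])
```

The guarantee comes from the fact that every solver in the portfolio is correct; racing them only affects which one's running time you pay for.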
Even continuous linear programming wasn't known to be in P-time until the 1980s or so.
As nullc mentioned, the simplex method can take exponential time on some pathological examples (a set of measure zero, if that matters). The first polynomial-time linear programming algorithm (and proof) was Khachiyan's ellipsoid method in 1979. I have the impression it was never really practical. Karmarkar's later polynomial-time interior-point method is sometimes used today (it beats simplex on certain types of problems), from what I understand. But I'm nowhere near expert in this area.
Graph vertex 3-colouring (G3C) is NP and NP-Hard, therefore NP-Complete (NPC).
Satisfiability is NP and NP-Hard, therefore NP-Complete (NPC). It is therefore equivalent (under some definition of "equivalent") to G3C.
There is a result that says that if you can solve general ILP problems then you can solve general G3C problems.
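The standard encoding behind that result is small enough to sketch: introduce a 0/1 variable x[v][c] meaning "vertex v gets colour c", with constraints sum_c x[v][c] == 1 (every vertex gets exactly one colour) and x[u][c] + x[v][c] <= 1 for every edge (adjacent vertices differ). Here is an illustrative version that checks those ILP constraints by brute force on tiny graphs, where a real ILP solver would normally go:

```python
# Encode graph 3-colouring as a 0/1 integer program and "solve" the tiny
# instance by enumerating all binary assignments (a stand-in for a solver).
from itertools import product

def g3c_via_ilp(n, edges):
    """Return a 3-colouring of the n-vertex graph, or None if infeasible."""
    for bits in product((0, 1), repeat=3 * n):
        x = [bits[3 * v:3 * v + 3] for v in range(n)]
        # Constraint 1: each vertex gets exactly one colour.
        if any(sum(row) != 1 for row in x):
            continue
        # Constraint 2: no edge has both endpoints the same colour.
        if any(x[u][c] + x[v][c] > 1 for u, v in edges for c in range(3)):
            continue
        return [row.index(1) for row in x]
    return None

# A triangle is 3-colourable; the complete graph K4 is not.
triangle = g3c_via_ilp(3, [(0, 1), (1, 2), (0, 2)])
k4 = g3c_via_ilp(4, [(u, v) for u in range(4) for v in range(u + 1, 4)])
```

A general ILP solver fed these constraints answers the G3C question, which is exactly why efficient general ILP solving would imply efficient G3C solving.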
There is also a known result that if you can solve arbitrary G3C problems then you can factor integers. While factoring integers (FAC) is not known to be NPC, it is clearly very important in today's computing environment.
So if you can solve arbitrary ILP problems you can break several current encryption algorithms that are thought to be secure.
So we can deduce that ILP is a fairly tricky problem to solve.
The thing that fools a lot of people is that random instances of NPC problems tend to be easy. The core of difficult instances gets smaller (in relative terms) as the problems get bigger, so if, say, you pick a random graph, it's probably trivial to find a vertex 3-colouring, or to show that no such colouring exists.