Comment by kragen
This is a 521-page CC-licensed book on optimization which looks absolutely fantastic. It starts out with modern gradient-based algorithms rooted in automatic differentiation, including recent ones like Adam, rather than with the historically more important linear-optimization algorithms like the simplex method (linear optimization gets the 24-page Chapter 12). There are a number of chapters on things I haven't even heard of, and, best of all, there are exercises.
I've been wanting something like this for a long time, and I regret not knowing about the first edition.
If you are wondering why this is a more interesting problem than, say, sorting a list, the answer is that optimization algorithms are attempts at the ideal of a fully general problem solver. Instead of writing a program to solve the problem, you write a program to recognize what a solution would look like, which is often much easier, for example with a labeled dataset. Then you apply the optimization algorithm to that program. That is how current AI is being done, with automatic differentiation and variants of Adam, but there are many other optimization algorithms which may be better alternatives in some circumstances.
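To make that concrete, here's a minimal sketch (not from the book) of the pattern: the "program that recognizes a solution" is a loss function over a labeled dataset, and a plain NumPy implementation of Adam with its usual default hyperparameters drives it toward a fit. The dataset and model are made up for illustration; a real setup would use autodiff instead of the hand-written gradient.

```python
import numpy as np

# Hypothetical labeled dataset: targets generated as y = 3x + 1 plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=(100, 1))
y = 3.0 * x[:, 0] + 1.0 + 0.05 * rng.standard_normal(100)
X = np.hstack([x, np.ones((100, 1))])  # add a bias column

def loss(theta):
    """'Recognize a solution': the loss is small when theta fits the data."""
    return np.mean((X @ theta - y) ** 2)

def grad(theta):
    """Analytic gradient of the loss; in practice autodiff provides this."""
    return 2.0 / len(y) * X.T @ (X @ theta - y)

def adam(grad, theta, steps=2000, alpha=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam (Kingma & Ba) with the standard default hyperparameters."""
    m = np.zeros_like(theta)
    v = np.zeros_like(theta)
    for t in range(1, steps + 1):
        g = grad(theta)
        m = beta1 * m + (1 - beta1) * g        # first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias correction
        v_hat = v / (1 - beta2 ** t)
        theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta

theta = adam(grad, np.zeros(2))
print(theta, loss(theta))  # theta should come out near [3, 1] with a small loss
```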
> ideal of a fully general problem solver
In practice that's basically the mindset, but full generality isn't technically possible because of the no-free-lunch theorems: averaged over all possible objective functions, no optimization algorithm outperforms any other.