How to Solve It: Modern Heuristics

No pleasure lasts long unless there is variety in it.
Publilius Syrus, Moral Sayings

We've been very fortunate to receive fantastic feedback from our readers during the four years since the first edition of How to Solve It: Modern Heuristics was published in 1999. It's heartening to know that so many people appreciated the book and, even more importantly, were using it to help them solve their problems. One professor, who published a review of the book, said that his students had given the best course reviews he'd seen in 15 years when using our text. There can hardly be any better praise, except to add that one of the book's reviews, published in a SIAM journal, received a best review award as well. We greatly appreciate the kind words and personal comments that you sent, including the few cases where you found typographical or other errors. Thank you all for this wonderful support.
Contents

What Are the Numbers? | 16 |
How Important Is a Model? | 31 |
What Are the Prices in 7-11? | 49 |
Traditional Methods | 60 |
What's the Color of the Bear? | 111 |
Tuning the Algorithm to the Problem | 277 |
Can You Mate in Two Moves? | 303 |
Day of the Week of January 1st | 335 |
What Was the Length of the Rope? | 363 |
Everything Depends on Something Else | 389 |