
Heuristic Search

  • Chapter in Fundamentals of Artificial Intelligence

Abstract

This chapter provides an in-depth study of heuristic search methods: methods for searching for the goal (solution) to a problem that behave more like a human and do not follow the exhaustive approach, making them far more efficient than the uninformed search methods. The introduction starts with a formal definition of heuristic search; then follow hill-climbing searches with their algorithm and analysis, best-first search with its algorithm and analysis, optimization, A-star search, and approaches to better heuristics. Finally, the search methods of simulated annealing (based on the heat treatment of metals) and Genetic Algorithm (GA)-based search are presented along with their analyses, followed by a chapter summary and, at the end, an exhaustive list of practice exercises along with multiple-choice questions.


Notes

  1. NP-complete problems are believed to require computational effort exponential in the input size n, i.e., of the order of \(k^n\) for some constant \(k > 1\).

  2. In problems such as theorem proving, the search must continue until a proof is found.

  3. \(f^*\) is also known as the evaluation function.


Author information

Correspondence to K. R. Chowdhary.

Exercises

  1. 1.

    Answer the following short review questions.

    1. a.

      Under what condition does best-first search become breadth-first search?

    2. b.

      What can you infer from the condition: \(f(n) = g(n)\)?

    3. c.

      What can you infer from the condition: \(f(n) = h(n)\)?

    4. d.

      In what situation does the \(A^*\) search become best-first search?

    5. e.

      What is the primary drawback of best-first search?

    6. f.

      Which search method(s) use the priority-queue data structure?
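The questions above can be explored concretely. Below is a minimal sketch of greedy best-first search built on the priority queue mentioned in (f); the graph and heuristic table are invented for illustration only:

```python
import heapq

def best_first_search(graph, h, start, goal):
    """Greedy best-first search: always expand the frontier node with
    the smallest heuristic value h(n). Returns a path, or None."""
    frontier = [(h[start], start, [start])]   # priority queue keyed on h
    visited = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nbr in graph.get(node, []):
            if nbr not in visited:
                heapq.heappush(frontier, (h[nbr], nbr, path + [nbr]))
    return None

# Hypothetical graph and heuristic values, for illustration only.
graph = {'S': ['A', 'B'], 'A': ['G'], 'B': ['G'], 'G': []}
h = {'S': 3, 'A': 1, 'B': 2, 'G': 0}
```

This sketch uses \(f(n) = h(n)\); keying the queue on \(g(n)\) or on \(g(n) + h(n)\) instead relates it to questions (b) through (d) above.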

  2. 2.

    Consider the 3-puzzle problem, where the board is \(2 \times 2\) and there are three tiles, numbered 1, 2, and 3, plus a blank tile. There are four operators, which move the blank tile up, down, left, and right. The start and goal states are given in Fig. 9.9. Show how the path to the goal can be found using

    1. a.

      Breadth-first search.

    2. b.

      Depth-first search.

    3. c.

      \(A^*\) search, with \(g(n)\) equal to the number of moves from the start state, and \(h(n)\) the number of misplaced tiles.

    Assume that there is no way to remember states that have been visited earlier. Also, apply the given operators in the given order unless the search method dictates otherwise. Label each visited node with a number indicating the order in which it is visited. If a search method does not find a solution, explain why this happens.

Fig. 9.9: State-space search
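Exercise 2(c) can be checked mechanically. Here is a minimal A* sketch for the \(2 \times 2\) board with \(g(n)\) the number of moves made and \(h(n)\) the number of misplaced tiles; since Fig. 9.9 is not reproduced here, the goal configuration in the code is an assumed one:

```python
import heapq

GOAL = (1, 2, 3, 0)  # assumed goal; 0 is the blank, board stored row-major

def misplaced(state):
    """h(n): number of tiles (excluding the blank) out of place."""
    return sum(1 for i, t in enumerate(state) if t != 0 and t != GOAL[i])

def neighbours(state):
    """Slide the blank up/down/left/right on the 2 x 2 board."""
    i = state.index(0)
    r, c = divmod(i, 2)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 2 and 0 <= nc < 2:
            j = nr * 2 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def astar(start):
    """A* with g(n) = number of moves and h(n) = misplaced tiles."""
    frontier = [(misplaced(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state == GOAL:
            return path
        for nxt in neighbours(state):
            if g + 1 < best_g.get(nxt, float('inf')):
                best_g[nxt] = g + 1
                heapq.heappush(
                    frontier,
                    (g + 1 + misplaced(nxt), g + 1, nxt, path + [nxt]))
    return None
```

Labelling nodes in the order they are popped from the frontier gives exactly the visit order the exercise asks for.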

  1. 3.

    Explain what algorithms or heuristics are suitable for solving constraint satisfaction problems under the following situations. Justify your answers.

    1. a.

      The problem is so tightly constrained that it is highly unlikely that solutions exist.

    2. b.

      The domain sizes vary significantly: some variables have very large domains (over 1,000 values) and some have very small domains (with fewer than 10 values).

    3. c.

      Eight-Queens Problem: Arrange eight queens on a chessboard in such a manner that none of them can attack any of the others. (Note: A queen attacks another queen if it can reach that queen's square by moving horizontally, vertically, or diagonally).

    4. d.

      The set of variables and the set of domains are handled by a computer, say M. Each constraint is handled by a networked computer, say N. Traffic in the network is slow. To check a particular constraint, computer M sends a message to computer N through the network, which in turn sends a message back to indicate whether the constraint is satisfied or violated.

  2. 4.

    Suggest a heuristic function for the 8-puzzle that sometimes overestimates, and show how it can lead to a suboptimal solution on a particular case.

  3. 5.

    Prove that, if the heuristic function h never overestimates by more than a constant cost c, then algorithm \(A^*\) making use of h returns a solution whose cost exceeds that of the optimal solution by no more than c.
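The claim in Exercise 5 follows the standard \(A^*\) optimality argument. A sketch, writing \(C^*\) for the optimal cost, \(G'\) for the goal node returned, and \(n^*\) for a node on an optimal path that is still on the open list when \(G'\) is selected:

```latex
\begin{align*}
g(G') = f(G') &\le f(n^*) = g(n^*) + h(n^*)\\
              &\le g(n^*) + h^*(n^*) + c
                 && \text{($h$ overestimates $h^*$ by at most $c$)}\\
              &=  C^* + c
                 && \text{($n^*$ lies on an optimal path, so $g(n^*) = g^*(n^*)$),}
\end{align*}
```

so the returned solution costs at most \(C^* + c\).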

  4. 6.

    Give the name of the algorithm that results from each of the following special cases:

    1. a.

      Local beam search with \(k = 1\).

    2. b.

      Local beam search with \(k = \infty \).

    3. c.

      Simulated annealing with \(T = 0\) at all times.

    4. d.

      Genetic algorithm with population size \(N = 1\).

  5. 7.

    Explain how you would use best-first search in each of the following cases. Give the data structure and explain the logic.

    1. a.

      Speech recognition

    2. b.

      PCB design

    3. c.

      Routing telephone traffic

    4. d.

      Routing Internet traffic

    5. e.

      Scene analysis

    6. f.

      Mechanical theorem proving

  6. 8.

    What type of data structure is suitable for implementing best-first search, such that each node in the frontier is directly accessible, and all the vertices behind it remain in the order in which they were visited?

  7. 9.

    Answer the following in one sentence/one word.

    1. a.

      How will you detect during the search of a graph that a particular node has been already visited?

    2. b.

      Is the best-first search optimal?

    3. c.

      Is the best-first search order-preserving?

    4. d.

      Is the best-first search admissible?

  8. 10.

    In the Traveling Salesperson Problem (TSP) one is given a fully connected, weighted, undirected graph and is required to find the Hamiltonian cycle (a cycle visiting all of the nodes in the graph exactly once) that has the least total weight.

    1. a.

      Outline how hill-climbing search could be used to solve TSP.

    2. b.

      How good a result would you expect hill-climbing to attain?

    3. c.

      Can other local search algorithms be used to solve TSP?
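A minimal hill-climbing sketch for part (a), using the simple "swap two cities" neighbourhood (2-opt would be the more usual choice); the distance matrix is an invented example:

```python
import random

def tour_length(tour, dist):
    """Total weight of the Hamiltonian cycle visiting cities in 'tour' order."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def hill_climb_tsp(dist, seed=0):
    """Hill-climbing for TSP: keep applying city swaps that shorten the
    tour; stop at a local optimum (no improving swap remains)."""
    rng = random.Random(seed)
    tour = list(range(len(dist)))
    rng.shuffle(tour)
    improved = True
    while improved:
        improved = False
        best = tour_length(tour, dist)
        for i in range(len(tour) - 1):
            for j in range(i + 1, len(tour)):
                tour[i], tour[j] = tour[j], tour[i]    # try a swap
                cand = tour_length(tour, dist)
                if cand < best:
                    best, improved = cand, True        # keep the improvement
                else:
                    tour[i], tour[j] = tour[j], tour[i]  # undo it
    return tour, best

# Invented symmetric distances; the optimal cycle 0-1-2-3-0 has weight 12.
dist = [[0, 1, 9, 9],
        [1, 0, 1, 9],
        [9, 1, 0, 1],
        [9, 9, 1, 0]]
tour, best = hill_climb_tsp(dist)
```

On larger instances hill-climbing stops at a local optimum, which is exactly the weakness part (b) asks about.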

  9. 11.

    Show that if a heuristic is consistent, then it can never overestimate the cost to reach the goal state. In other words, if a heuristic is monotonic, then it is admissible.

  10. 12.

    Suggest an admissible heuristic that is not consistent.

  11. 13.

    Can GAs get stuck in local maxima? If not, how do GAs try to avoid them? If yes, justify your answer.

  12. 14.

    Explain the different data structures that can be used to implement the open list in BFS, DFS, and best-first search.

  13. 15.

    Find the worst-case memory requirement of best-first search.

  14. 16.

    If there is no solution, will \(A^*\) explore the whole graph? Justify.

  15. 17.

    Define and describe the following terms related to heuristics:

    Admissibility, monotonicity, informedness.

  16. 18.

    Show that:

    1. a.

      \(A^*\) will ultimately terminate.

    2. b.

      During the execution of the \(A^*\) algorithm, there is always a node in the open list that lies on a path to the goal.

    3. c.

      If there exists a path to the goal, the algorithm \(A^*\) will terminate by finding a path to the goal.

    4. d.

      If there is no solution, the \(A^*\) algorithm will explore the whole graph.

  17. 19.

    Discuss the ways in which the h function in \(f(n) = g(n) + h(n)\) can be improved during the search.

  18. 20.

    Why does the \(A^*\) algorithm work properly on a graph search even when the graph has cycles?

  19. 21.

    For the graph shown in Fig. 9.5, find out whether the \(A^*\) search for this graph is

    1. a.

      Optimal?

    2. b.

      Order-preserving?

    3. c.

      Complete?

    4. d.

      Sound?

  20. 22.

    Apply the BFS algorithm for robot path planning in the presence of obstacles on the \(8 \times 8\) grid given in Fig. 9.10. Write an algorithm to generate the frontier paths. Assume that each horizontal (H) or vertical (V) move of the robot covers a unit distance, and that the robot can make only H and V moves. The start and goal nodes are marked S and G. Shaded tiles indicate obstacles, i.e., the robot cannot pass through them.

Fig. 9.10: \(8 \times 8\) tiles, with obstacles shaded
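Exercise 22 can be prototyped as follows; the obstacle layout of Fig. 9.10 is not reproduced here, so the grid below is a smaller invented example (1 marks an obstacle):

```python
from collections import deque

def bfs_grid(grid, start, goal):
    """BFS over an N x N grid of 0/1 cells (1 = obstacle). Moves are the
    unit horizontal (H) and vertical (V) steps only. Returns a shortest
    path as a list of (row, col) cells, or None if the goal is cut off."""
    n = len(grid)
    queue = deque([[start]])   # FIFO queue of frontier paths
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < n and 0 <= nc < n
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None

# Sample 4 x 4 layout (the actual obstacles of Fig. 9.10 are not shown here).
grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 0]]
path = bfs_grid(grid, (0, 0), (3, 3))
```

Because BFS expands paths in order of length, the first path that reaches G is a shortest one under the unit H/V move cost.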

Fig. 9.11: Magic-puzzle

  1. 23.

    Redesign the robot problem shown in Fig. 9.10 for \(A^*\) search. Assume that the value of h is the number of squares given by \((V-i) + (H-j)\), where V and H are both 8.

  2. 24.

    Solve the 8-puzzle manually for 20 steps, where the heuristic is the number of tiles out of place with respect to the goal state. Assume that \(f^*(n) = g^*(n) + h^*(n)\) \(= 0 + h^*(n) = h^*(n)\), so that the heuristic alone is the deciding factor for the next node. Note that the algorithm shall be DFS. In case of ties, give preference to the nodes to the left of the search tree.

  3. 25.

    Find an appropriate state-space representation for the following problems. Also, suggest suitable heuristics for each.

    1. a.

      Cryptarithmetic problems (e.g., TWO \(+\) TWO \(=\) FOUR)

    2. b.

      Towers of Hanoi

    3. c.

      Farmer, Fox, Goose, and Grain.

  4. 26.

    Suggest appropriate heuristics for each of the following problems:

    1. a.

      Theorem proving using resolution method

    2. b.

      Blocks world

  5. 27.

    If \(P=\) “heuristic is consistent” and \(Q=\) “heuristic is admissible”, show that \(P\Rightarrow Q\). Demonstrate by counterexample that \(Q \nRightarrow P\).

  6. 28.

    Consider the magic-puzzle shown in Fig. 9.11. Suggest a formalism for searching for the goal state when starting from the start state. (Note that in the goal state all the rows, columns, and diagonals have sums equal to 15).

  7. 29.

    Make use of a GA to solve the 4-puzzle (Fig. 9.12). A move consists of sliding one of the tiles 1, 2, or 3 into the blank position. Such a move leaves the blank at a different position, and the process is repeated until the goal state is reached. The solution requires not only reaching the goal state but also tracing the path taken to reach it. Construct a suitable fitness function to implement search by a GA; the search should consider only those members of the population that correspond to valid moves.

  8. 30.

    For the graph shown in Fig. 9.13, use DFS with a depth cut-off, backtracking the search when the cut-off is reached.

Fig. 9.12: 4-puzzle

Fig. 9.13: A graph with start node S and goal node G

Fig. 9.14: Graph with start node A and goal node G

  1. 31.

    Use best-first search for Fig. 9.14 to find out if the search from start node A to goal node G is

    1. a.

      Order-preserving

    2. b.

      Admissible

    3. c.

      Optimal

  2. 32.

    How would you apply simulated annealing in the following scenarios? For each case, give the problem formulation needed to compute \(\varDelta E\), the temperature T, the states \(s, s^\prime \), the state space \(\mathbf {S}\), and the objective function \(f: \mathbf {S} \rightarrow \mathbb {R}\), and perform 5–10 iteration steps manually.

    1. a.

      8-puzzle

    2. b.

      8-Queen problem

    3. c.

      Tic-tac-toe problem
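For part (b), a minimal simulated-annealing sketch: the state s is one row index per column, \(\varDelta E\) is the change in the number of attacking pairs, and the temperature floor is an assumption added to keep this toy chain ergodic:

```python
import math
import random

def conflicts(board):
    """Objective f(s): number of attacking queen pairs. board[c] is the
    row of the queen in column c, so only row/diagonal attacks occur."""
    n = len(board)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if board[i] == board[j] or abs(board[i] - board[j]) == j - i)

def anneal_queens(n=8, t0=2.0, cooling=0.999, steps=20000, seed=1):
    """Simulated annealing for n-queens: a neighbour s' moves one random
    queen to a random row; a worsening move is accepted with
    probability e^(-dE/T)."""
    rng = random.Random(seed)
    s = [rng.randrange(n) for _ in range(n)]   # random initial state
    t = t0
    for _ in range(steps):
        cur = conflicts(s)
        if cur == 0:
            break                                 # goal: no attacks left
        s2 = list(s)
        s2[rng.randrange(n)] = rng.randrange(n)   # candidate state s'
        d_e = conflicts(s2) - cur                 # delta-E
        if d_e <= 0 or rng.random() < math.exp(-d_e / t):
            s = s2
        t = max(t * cooling, 0.5)                 # cooling, with assumed floor
    return s
```

The same skeleton serves the 8-puzzle by swapping in a puzzle state, its slide moves, and a heuristic such as misplaced tiles as f.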

  3. 33.

    One fine morning you find that your laptop is not booting. There can be innumerable reasons for this. Assume that you are an expert in the installation and maintenance of laptops. Represent the search process for troubleshooting the laptop by constructing a search tree.

    1. a.

      Suggest what search method you consider most appropriate for this, and justify the choice of that particular method.

    2. b.

      What heuristics would you suggest for making the search efficient?

    3. c.

      What are the characteristics of this search? Comment on the admissibility, monotonicity, and completeness of this solution.

  4. 34.

    Assume a population of 10 members, and apply a GA to each of the following problems, performing five cycles of iteration, each consisting of selection, mutation, and crossover. Verify that after these iterations we are far closer to the solution than we were at the beginning. Represent the members as bit strings \(\{0, 1\}^n\) for some integer n. Also fix criteria for the fitness function, as well as the probability of mutation and the point of crossover.

    1. a.

      8-puzzle

    2. b.

      Square root of a number

    3. c.

      Factors of an integer
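A minimal GA sketch for item (b) above (square root), with all parameters (8-bit members, elitist halving for selection, one-point crossover, 5% mutation rate, even population size) chosen arbitrarily for illustration:

```python
import random

def ga_sqrt(target, n_bits=8, pop_size=10, generations=5, seed=3):
    """GA sketch: evolve an n-bit integer x whose square is close to
    'target'. Fitness = -(x^2 - target)^2, so higher is better.
    Assumes pop_size is even."""
    rng = random.Random(seed)

    def decode(bits):
        return int(''.join(bits), 2)

    def fitness(bits):
        return -(decode(bits) ** 2 - target) ** 2

    pop = [[rng.choice('01') for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half, duplicated to restore size.
        pop.sort(key=fitness, reverse=True)
        pop = pop[:pop_size // 2] * 2
        # Crossover: one-point, at a random cut position, on adjacent pairs.
        for i in range(0, pop_size - 1, 2):
            cut = rng.randrange(1, n_bits)
            pop[i], pop[i + 1] = (pop[i][:cut] + pop[i + 1][cut:],
                                  pop[i + 1][:cut] + pop[i][cut:])
        # Mutation: flip each bit with small probability.
        for member in pop:
            for k in range(n_bits):
                if rng.random() < 0.05:
                    member[k] = '1' if member[k] == '0' else '0'
    return decode(max(pop, key=fitness))
```

For example, `ga_sqrt(144)` should drift toward members decoding near 12, though a GA this small offers no guarantee of hitting the exact optimum.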

  5. 35.

    What are the consequences of the following special cases of GA-based search?

    1. a.

      Only the selection operation is performed in each iteration, based on the fitness value.

    2. b.

      Only the crossover operation is performed in each iteration at a random position.

    3. c.

      Only the mutation operation is performed in each iteration at a random bit position.

  6. 36.

    Simulated annealing is guided by a changing “temperature” value that determines the likelihood of visiting nodes that appear to be worse than the current node. How does the search behave for very low and very high temperature values, and why does it behave so?

  7. 37.

    Select the best alternatives in each of the following questions.

    1. i.

      The mutation operation is good for the following:

      \( \begin{array}{ll} \text {(a) noise tolerance} &{} \text {(b) hill-climbing}\\ \text {(c) random walk} &{} \text {(d) all above} \\ \end{array}\)

    2. ii.

      The following operation of GA has maximum contribution to search:

      \(\begin{array}{ll} \text {(a) mutation} &{} \text {(b) selection}\\ \text {(c) crossover} &{} \text {(d) fitness function}\\ \end{array}\)

    3. iii.

      What operation of GA is responsible for random walk?

      \(\begin{array}{ll} \text {(a) mutation}&{} \text {(b) crossover} \\ \text {(c) none above}&{} \text {(d) both a and b}\\ \end{array}\)

    4. iv.

      GAs are not good for the following purpose:

      \(\begin{array}{ll} \text {(a) finding exact global optimum} &{} \text {(b) local search} \\ \text {(c) approximate solution}&{} \text {(d) global search} \\ \end{array}\)

    5. v.

      GAs are good in environments that are:

      \(\begin{array}{ll} \text {(a) complex} &{} \text {(b) noisy} \\ \text {(c) dynamic} &{} \text {(d) all above} \\ \end{array}\)

    6. vi.

      GA is always:

      \(\begin{array}{ll} \text {(a)}\,\mathbf {P} &{} \text {(b) nondeterministic} \\ \text {(c)}\,\mathbf {NP} &{} \text {(d) deterministic} \\ \end{array}\)

Copyright information

© 2020 Springer Nature India Private Limited

Cite this chapter

Chowdhary, K.R. (2020). Heuristic Search. In: Fundamentals of Artificial Intelligence. Springer, New Delhi. https://doi.org/10.1007/978-81-322-3972-7_9

  • Print ISBN: 978-81-322-3970-3

  • Online ISBN: 978-81-322-3972-7