The tree-search version of A* is optimal if h(n) is admissible, while the graph-search version is optimal if h(n) is consistent.

The first step is to establish the following: if h(n) is consistent, then the values of f(n) along any path are nondecreasing. The proof follows directly from the definition of consistency.

Δ Suppose n' is a successor of n; then g(n') = g(n) + c(n,a,n') for some action a, and we have:

f(n') = g(n') + h(n') = g(n) + c(n,a,n') + h(n') ≥ g(n) + h(n) = f(n),

where the inequality uses the consistency condition h(n) ≤ c(n,a,n') + h(n').
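This monotonicity is easy to check numerically. The sketch below uses a hypothetical four-node graph and heuristic values (names and costs invented for illustration), verifies that h is consistent, and confirms that f never decreases along one path:

```python
# Hypothetical graph and heuristic values, invented for illustration.
edges = {  # edge costs c(n, a, n')
    "S": {"A": 1, "B": 4},
    "A": {"B": 2, "G": 6},
    "B": {"G": 3},
    "G": {},
}
h = {"S": 4, "A": 3, "B": 2, "G": 0}

# h is consistent: h(n) <= c(n, a, n') + h(n') for every edge.
for n, succs in edges.items():
    for n2, c in succs.items():
        assert h[n] <= c + h[n2]

# Therefore f = g + h is nondecreasing along any path, e.g. S-A-B-G.
g, prev_f = 0, float("-inf")
for prev, n in [(None, "S"), ("S", "A"), ("A", "B"), ("B", "G")]:
    if prev is not None:
        g += edges[prev][n]
    f = g + h[n]
    assert f >= prev_f
    prev_f = f
```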

Δ The next step is to prove that whenever A* selects a node n for expansion, the optimal path to that node has been found. Were this not the case, there would have to be another frontier node n' on the optimal path from the start node to n, by the graph separation property; because f is nondecreasing along any path, n' would have lower f-cost than n and would have been selected first.

Δ The sequence of nodes expanded by A* using GRAPH-SEARCH is in nondecreasing order of f(n). Hence, the first goal node selected for expansion must be an optimal solution because f is the true cost for goal nodes (which have h=0) and all later goal nodes will be at least as expensive.
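A minimal A* graph search makes these properties visible. The graph, step costs, and heuristic values below are hypothetical; the sketch records the f-value of each expanded node and checks that the sequence is nondecreasing and that the first goal popped is optimal:

```python
import heapq

def astar(start, goal, succs, h):
    """A* graph search (sketch): returns (solution cost, f-values of
    expanded nodes). succs(n) yields (neighbor, step cost) pairs."""
    frontier = [(h(start), 0, start)]            # entries are (f, g, node)
    best_g = {start: 0}
    expanded_f = []
    while frontier:
        f, g, n = heapq.heappop(frontier)
        if g > best_g.get(n, float("inf")):      # stale frontier entry
            continue
        expanded_f.append(f)
        if n == goal:                            # first goal popped is optimal
            return g, expanded_f
        for n2, c in succs(n):
            g2 = g + c
            if g2 < best_g.get(n2, float("inf")):
                best_g[n2] = g2
                heapq.heappush(frontier, (g2 + h(n2), g2, n2))
    return None, expanded_f

# Hypothetical graph and consistent heuristic, invented for illustration.
graph = {"S": [("A", 1), ("B", 4)], "A": [("B", 2), ("G", 6)],
         "B": [("G", 3)], "G": []}
hvals = {"S": 4, "A": 3, "B": 2, "G": 0}
cost, fs = astar("S", "G", lambda n: graph[n], lambda n: hvals[n])
assert cost == 6                   # optimal path S-A-B-G
assert fs == sorted(fs)            # expansions in nondecreasing f-order
assert all(f <= cost for f in fs)  # no node with f > C* was expanded
```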

f-costs can be drawn on the state space like contours in a map.

With uniform-cost search (A* search using h(n) = 0), the bands will be "circular" around the start state. With more accurate heuristics, the bands stretch toward the goal state and become more narrowly focused around the optimal path.
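The effect of contour shape on the number of expansions can be shown on a small obstacle-free grid (the grid size and tie-breaking rule below are choices of this sketch, not part of the text). With h(n) = 0 the bands are "circular" and the search expands every cell; with the Manhattan-distance heuristic the band hugs an optimal path:

```python
import heapq

GOAL = (9, 9)  # hypothetical 10x10 obstacle-free grid, unit step costs

def expansions(h):
    """Best-first search from (0,0); returns number of nodes expanded.
    h = lambda n: 0 gives uniform-cost search. Ties on f prefer larger g
    (entries are (f, -g, node)), a common A* tie-breaking choice."""
    frontier = [(h((0, 0)), 0, (0, 0))]
    best = {(0, 0): 0}
    count = 0
    while frontier:
        f, neg_g, n = heapq.heappop(frontier)
        g = -neg_g
        if g > best.get(n, float("inf")):
            continue  # stale entry
        count += 1
        if n == GOAL:
            return count
        x, y = n
        for n2 in [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]:
            if 0 <= n2[0] < 10 and 0 <= n2[1] < 10:
                g2 = g + 1
                if g2 < best.get(n2, float("inf")):
                    best[n2] = g2
                    heapq.heappush(frontier, (g2 + h(n2), -g2, n2))

manhattan = lambda n: abs(n[0] - GOAL[0]) + abs(n[1] - GOAL[1])
ucs_count = expansions(lambda n: 0)   # "circular" bands: the whole grid
astar_count = expansions(manhattan)   # narrow band along an optimal path
assert astar_count < ucs_count
```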

Δ If C* is the cost of the optimal solution path, then we can say:

Δ A* expands all nodes with f(n) < C*.

Δ A* might then expand some of the nodes right on the "goal contour" (where f(n) = C*) before selecting a goal node.

Completeness requires that there be only finitely many nodes with cost less than or equal to C*, a condition that is true if all step costs exceed some finite ε and if b is finite.

A* expands no nodes with f(n) > C*.

Δ Any such path is pruned: because h is admissible, A* can safely ignore any node whose f-cost exceeds C* while still guaranteeing optimality.

A* is optimally efficient for any given consistent heuristic.

A* search is complete, optimal, and optimally efficient. The catch is that, for most problems, the number of states within the goal contour of the search space is still exponential in the length of the solution.

Δ For constant step costs, the time complexity is O(b^(εd)) = O((b^ε)^d), where ε = (h* − h)/h* is the relative error of the heuristic, so the effective branching factor is b^ε.
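A quick back-of-the-envelope check of this formula, using illustrative numbers (b = 2, d = 10, ε = 0.5 are arbitrary choices):

```python
# With branching factor b, depth d, and relative heuristic error eps,
# A* examines on the order of b**(eps*d) nodes instead of b**d.
b, d, eps = 2, 10, 0.5
blind = b ** d            # 2**10 = 1024 nodes without a heuristic
informed = b ** (eps * d) # 2**5  = 32 nodes: effective branching factor b**eps
assert blind == 1024
assert informed == 32
```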