SSS* and Dual*

SSS* is a best-first state space search algorithm developed in 1977 by George C. Stockman for the linguistic analysis of waveforms. In a later paper, Stockman (1979) showed how to use the algorithm to determine the minimax value of game trees. SSS* and its counterpart Dual* are non-directional algorithms that search AND/OR graphs in a best-first manner similar to A*. They expand multiple paths of the search graph simultaneously, retain global information about the search space, and search fewer nodes than Alpha-Beta in fixed-depth minimax tree search.

The algorithm was examined and improved by various researchers: Igor Roizen and Judea Pearl, Toshihide Ibaraki, Tony Marsland et al., Subir Bhattacharya and Amitava Bagchi, Alexander Reinefeld, Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin. In 1988, Burkhard Monien and Oliver Vornberger compared parallel Alpha-Beta with parallel SSS*, and in 1990, Hans-Joachim Kraas wrote his Ph.D. thesis on how to parallelize SSS*.

However, it turned out that the algorithmic overhead was too large to be offset by the savings in nodes searched. Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin pointed out that SSS* can be reformulated as a sequence of Alpha-Beta null window calls with a Transposition Table.

=SSS*=
Aske Plaat et al. about SSS*:

In 1979 Stockman introduced SSS*, which looked like a radically different approach from Alpha-Beta for searching fixed-depth minimax trees. It builds a tree in a so-called best-first fashion by visiting the most promising nodes first. Alpha-Beta, in contrast, uses a depth-first, left-to-right traversal of the tree. Intuitively, it would seem that a best-first strategy should prevail over a rigidly ordered depth-first one. Stockman proved that SSS* dominated Alpha-Beta; it would never evaluate more leaf nodes than Alpha-Beta. Numerous simulations have shown that on average SSS* evaluates considerably fewer leaf nodes. Why, then, has the algorithm been shunned by practitioners?

Alexander Reinefeld gives a good answer, and an explanation of SSS*, in his lecture:

SSS* maintains an OPEN list with descriptors of the active nodes. Descriptors are sorted in decreasing order of their merit (h-values). A descriptor (n, s, h) consists of
 * n - the node identifier
 * s - the status of the node, either LIVE or SOLVED
 * h - the merit of the node, an upper bound on the minimax value

SSS* proceeds in two search phases: a top-down expansion phase that descends with LIVE nodes, and a bottom-up solution phase that backs up SOLVED nodes:

Pseudo C-Code
int SSS*(node n; int bound) {
  push (n, LIVE, bound);
  while ( true ) {
    pop (node);
    switch ( node.status ) {
    case LIVE:
      if (node == LEAF)
        insert (node, SOLVED, min(eval(node), h));
      if (node == MIN_NODE)
        push (node.1, LIVE, h);
      if (node == MAX_NODE)
        for (j=w; j; j--)
          push (node.j, LIVE, h);
      break;
    case SOLVED:
      if (node == ROOT_NODE)
        return (h);
      if (node == MIN_NODE) {
        purge (parent(node));
        push (parent(node), SOLVED, h);
      }
      if (node == MAX_NODE) {
        if (node has an unexamined brother)
          push (brother(node), LIVE, h);
        else
          push (parent(node), SOLVED, h);
      }
      break;
    }
  }
}

SSS* is too complex and too slow! The insert and purge operations on the sorted OPEN list alone take 90% of the CPU time!
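To make the OPEN-list mechanics concrete, here is a minimal runnable sketch of the algorithm above in Python. It is a toy, not a tournament implementation: the game tree is given as nested lists with the root as a MAX node, nodes are identified by index paths, and the OPEN list is a plain Python list that is re-sorted on every pop, which makes the insert/purge cost plainly visible:

```python
INF = float("inf")
LIVE, SOLVED = 0, 1

def sss_star(tree):
    """SSS* on a nested-list game tree; the root is a MAX node.

    Nodes are identified by index paths into `tree`; ties in the
    merit h are resolved left-first (lexicographic path order).
    """
    def subtree(path):
        t = tree
        for i in path:
            t = t[i]
        return t

    def is_max(path):                 # root (depth 0) is MAX, levels alternate
        return len(path) % 2 == 0

    open_list = [((), LIVE, INF)]     # descriptors (n, s, h)
    while True:
        # pop the descriptor with the highest merit h (leftmost on ties)
        open_list.sort(key=lambda d: (-d[2], d[0]))
        node, status, h = open_list.pop(0)
        t = subtree(node)
        if status == LIVE:
            if not isinstance(t, list):                  # leaf: solve, tighten bound
                open_list.append((node, SOLVED, min(t, h)))
            elif is_max(node):                           # MAX node: expand all sons
                for j in range(len(t)):
                    open_list.append((node + (j,), LIVE, h))
            else:                                        # MIN node: leftmost son only
                open_list.append((node + (0,), LIVE, h))
        else:
            if node == ():                               # root solved: h is the value
                return h
            parent = node[:-1]
            if is_max(node):                             # son of a MIN node
                if node[-1] + 1 < len(subtree(parent)):  # try the next brother
                    open_list.append((parent + (node[-1] + 1,), LIVE, h))
                else:                                    # last brother: solve parent
                    open_list.append((parent, SOLVED, h))
            else:                                        # son of a MAX node:
                open_list = [d for d in open_list        # purge the parent's subtree
                             if d[0][:len(parent)] != parent]
                open_list.append((parent, SOLVED, h))
```

On the toy tree [[3, 5], [2, 9]] (minimax value 3) the leaf 9 is never evaluated: the purge step discards the solved but dominated right subtree before the root is solved.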

Aske Plaat et al. continue:

SSS*, as formulated by Stockman, has several problems. First, it takes considerable effort to understand how the algorithm works, and still more to understand its relation to Alpha-Beta. Second, SSS* maintains a data structure known as the OPEN list, similar to that found in single-agent search algorithms like A*. The size of this list grows exponentially with the depth of the search tree. This has led many authors to conclude that SSS* is effectively disqualified from being useful for real applications like game-playing programs. Third, the OPEN list must be kept in sorted order. Insert and (in particular) delete/purge operations on the OPEN list can dominate the execution time of any program using SSS*. Despite the promise of expanding fewer nodes, the disadvantages of SSS* have proven a significant deterrent in practice.

Quote by Judea Pearl, 1984:

The meager improvement in the pruning power of SSS* is more than offset by the increased storage space and bookkeeping (e.g. sorting OPEN) that it requires. One can safely speculate therefore that alphabeta will continue to monopolize the practice of computerized game playing.

=Dual*=
Dual* is the dual counterpart of SSS*, introduced by Tony Marsland et al. It can be created by inverting SSS*'s operations: use an ascendingly sorted OPEN list instead of a descending one, swap the max and min operations, and start at -oo instead of +oo.

=RecSSS* and RecDual*=
The development of recursive variants was driven by the need for a better understanding of SSS*'s node expansion process and by the demand for more efficient implementations. Two recursive variants were proposed in 1990: RecSSS* by Subir Bhattacharya and Amitava Bagchi, and SSS-2 by Wim Pijls and Arie de Bruin. In 1993, Alexander Reinefeld improved RecSSS*, making it both faster and more space efficient by using an OPEN array rather than dynamic list structures.

Pseudo C-Code
based on Alexander Reinefeld's Pascal pseudo code:

int RecSSS*(nodeType n) {
  if (n is leaf) {
    s(n) = SOLVED;
    return min (evaluate(n), h(n));                    /* evaluate leaf */
  }
  if ( s(n) == UNEXPANDED ) {                          /* first descent? */
    s(n) = LIVE;                                       /* expand n */
    for (i = 1 to width)
      if ( n.i is leaf )
        insert (n.i, UNEXPANDED, h(n));                /* insert sons (= MIN leaves) of n */
      else
        insert (n.i.1, UNEXPANDED, h(n));              /* insert left grandsons of n */
  }
  g = highest h-valued grandson (or son) of n in OPEN; /* resolve ties in lexicographical order */
  while ( h(g) == h(n) && s(g) != SOLVED ) {
    h(g) = RecSSS*(g);                                 /* get new upper bound */
    if ( s(g) == SOLVED && g has a right brother )
      replace g by (brother(g), UNEXPANDED, h(g));     /* next brother of g */
    g = highest h-valued grandson (or son) of n in OPEN;
  }
  if ( s(g) == SOLVED )
    s(n) = SOLVED;
  return h(g);
}

int main() {
  insert (root, UNEXPANDED, oo);
  do
    h = RecSSS*(root);
  while ( s(root) != SOLVED );
}

=SSS* and Dual* as MT=
Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin proved with their Memory-enhanced Test (MT) framework that both SSS* and Dual* can be reformulated as a sequence of Alpha-Beta null window calls with a Transposition Table:

Pseudo C-Code
int MT-SSS*( n ) {
  g := +oo;
  do {
    G := g;
    g := Alpha-Beta(n, G-1, G);
  } while (g != G);
  return g;
}

int MT-DUAL*(n) {
  g := -oo;
  do {
    G := g;
    g := Alpha-Beta(n, G, G+1);
  } while (g != G);
  return g;
}

At the 8th Advances in Computer Chess conference 1996, SSS* was finally declared "dead" by Wim Pijls and Arie de Bruin.
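The two null-window drivers above can be sketched as runnable code. The following Python sketch uses a fail-soft Alpha-Beta on a nested-list game tree; for clarity it omits the Transposition Table, so unlike the MT framework proper it re-expands nodes on every pass, and it uses a large finite value in place of ±oo so that G-1 and G+1 remain meaningful:

```python
INF = 10**9  # finite stand-in for oo, so G-1 and G+1 stay meaningful

def alpha_beta(tree, alpha, beta, maximizing=True):
    """Fail-soft Alpha-Beta on a nested-list tree (root is a MAX node)."""
    if not isinstance(tree, list):
        return tree                  # leaf value
    if maximizing:
        g = -INF
        for child in tree:
            g = max(g, alpha_beta(child, alpha, beta, False))
            alpha = max(alpha, g)
            if g >= beta:            # beta cutoff
                break
        return g
    else:
        g = INF
        for child in tree:
            g = min(g, alpha_beta(child, alpha, beta, True))
            beta = min(beta, g)
            if g <= alpha:           # alpha cutoff
                break
        return g

def mt_sss(tree):
    """MT-SSS*: drive an upper bound g down with null-window searches."""
    g = INF
    while True:
        G = g
        g = alpha_beta(tree, G - 1, G)
        if g == G:
            return g

def mt_dual(tree):
    """MT-DUAL*: drive a lower bound g up with null-window searches."""
    g = -INF
    while True:
        G = g
        g = alpha_beta(tree, G, G + 1)
        if g == G:
            return g
```

Fail-soft is essential here: each failed null-window call returns a tightened bound rather than just alpha or beta, so g jumps toward the minimax value instead of stepping by 1. With a Transposition Table added, the repeated passes share work and MT-SSS* expands the same nodes as SSS* without any OPEN list.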

=See also=
 * MTD(f)
 * NegaC*
 * NegaScout
 * Scout

=Publications=

1979

 * George Stockman (1979). A Minimax Algorithm Better than Alpha-Beta? Artificial Intelligence, Vol. 12, No. 2.

1980 ...

 * Igor Roizen, Judea Pearl (1983). A Minimax Algorithm Better than Alpha-Beta? Yes and No. Artificial Intelligence, Vol. 21
 * Murray Campbell, Tony Marsland (1983). A Comparison of Minimax Tree Search Algorithms. Artificial Intelligence, Vol. 20, No. 4, pp. 347-367. ISSN 0004-3702, pdf
 * Nanda Srimani (1985). A New Algorithm (PS*) for Searching Game Trees. Master's thesis, University of Alberta
 * Daniel B. Leifker, Laveen N. Kanal (1985). A Hybrid SSS*/Alpha-Beta Algorithm for Parallel Search of Game Trees. IJCAI'85
 * Toshihide Ibaraki (1986). Generalizations of alpha-beta and SSS* search procedures. Artificial Intelligence, 29, 73-117
 * Tony Marsland, Nanda Srimani (1986). Phased State Search. Fall Joint Computer Conference, pdf
 * Tony Marsland, Alexander Reinefeld, Jonathan Schaeffer (1987). Low Overhead Alternatives to SSS*. Artificial Intelligence, Vol. 31, No. 2, pp. 185-199. ISSN 0004-3702.
 * Burkhard Monien, Oliver Vornberger (1988). Parallel Alpha-Beta versus Parallel SSS*. Proc. of the IFIP WG 10.3 Working Conference on Distributed Processing, North Holland

1990 ...

 * Subir Bhattacharya, Amitava Bagchi (1990). Unified Recursive Schemes for Search in Game Trees. Technical Report WPS-144, Indian Institute of Management, Calcutta
 * Hans-Joachim Kraas (1990). Zur Parallelisierung des SSS*-Algorithmus (On the Parallelization of the SSS* Algorithm). Ph.D. thesis, TU Braunschweig (German)
 * Wim Pijls, Arie de Bruin (1990). Another View on the SSS* Algorithm. International Symposium SIGAL '90 (eds. T. Asano, T. Ibaraki, H. Imai, and T. Nishizeki), pp. 211-220. Tokyo, Japan. Lecture Notes in Computer Science, Vol. 450, Springer-Verlag, New York, NY. ISSN 0302-9743.
 * Claude G. Diderich (1992). Evaluation des performances de l'algorithme SSS* avec phases de synchronisation sur une machine parallèle à mémoires distribuées (Performance Evaluation of the SSS* Algorithm with Synchronization Phases on a Distributed-Memory Parallel Machine). Technical report LITH-99, Swiss Federal Institute of Technology, Computer Science Theory Laboratory, Lausanne, Switzerland (French)
 * Alexander Reinefeld (1994). A Minimax Algorithm Faster than Alpha-Beta. Advances in Computer Chess 7
 * Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1994). SSS* = a-b TT. TR-CS-94-17, University of Alberta
 * Alexander Reinefeld, Peter Ridinger (1994). Time-Efficient State Space Search. Artificial Intelligence, Vol. 71, No. 2, CiteSeerX
 * Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1996). An Algorithm Faster than NegaScout and SSS* in Practice, pdf from CiteSeerX, covers MTD(f)
 * Arie de Bruin, Wim Pijls (1997). SSS†. Advances in Computer Chess 8
 * Aske Plaat, Arie de Bruin, Jonathan Schaeffer, Wim Pijls (1999). A Minimax Algorithm better than SSS*. Artificial Intelligence, Vol. 87

2010 ...

 * Bojun Huang (2015). Pruning Game Tree by Rollouts. AAAI » MT-SSS*, MCTS

=External Links=
 * SSS* From Wikipedia
 * State space search From Wikipedia
 * State Space Search from Murray's Web Pages
