SSS* and Dual*

From Chessprogramming wiki

M. C. Escher, Stars, 1948 [1]

SSS* is a best-first state space search algorithm developed in 1977 by George C. Stockman for the linguistic analysis of waveforms [2]. In a later paper, Stockman (1979) [3] showed how to use the algorithm to determine the minimax value of game trees. SSS* and its counterpart Dual* are non-directional algorithms that search AND/OR graphs in a best-first manner similar to A*. They expand multiple paths of the search graph simultaneously, retain global information about the search space, and search fewer nodes than Alpha-Beta in fixed-depth minimax tree search.

The algorithm was examined and improved by various researchers: Igor Roizen and Judea Pearl [4], Toshihide Ibaraki [5], Tony Marsland et al., Subir Bhattacharya and Amitava Bagchi, Alexander Reinefeld, Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin. In 1988, Burkhard Monien and Oliver Vornberger compared parallel Alpha-Beta with parallel SSS* [6], and in 1990, Hans-Joachim Kraas wrote his Ph.D. thesis on parallelizing SSS* [7].

However, it turned out that the algorithmic overhead was too large to be paid off by the saved node expansions. Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin pointed out that SSS* can be reformulated as a sequence of Alpha-Beta null window calls with a Transposition Table.


Aske Plaat et al. about SSS*:

In 1979 Stockman introduced SSS*, which looked like a radically different approach from Alpha-Beta for searching fixed-depth minimax trees. It builds a tree in a so-called best-first fashion by visiting the most promising nodes first. Alpha-Beta, in contrast, uses a depth-first, left-to-right traversal of the tree. Intuitively, it would seem that a best-first strategy should prevail over a rigidly ordered depth-first one. Stockman proved that SSS* dominated Alpha-Beta; it would never evaluate more leaf nodes than Alpha-Beta. Numerous simulations have shown that on average SSS* evaluates considerably fewer leaf nodes. Why, then, has the algorithm been shunned by practitioners?

Alexander Reinefeld gives a good answer, along with an explanation of SSS*, in his lecture [8]:

SSS* maintains an OPEN list with descriptors of the active nodes. Descriptors are sorted in decreasing order of their merit (h values). A descriptor (n, s, h) consists of
  • a node identifier n
  • a status s
    • LIVE: n is still unexpanded and h is an upper bound on the true value
    • SOLVED: h is the true value
  • a merit h
SSS*'s two search phases:
  1. Node Expansion Phase: Top down expansion of a MIN strategy.
  2. Solution Phase: Bottom up search for the best MAX strategy.
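The descriptors map naturally onto a priority queue ordered by merit. A minimal illustration in Python (node names and h values are made up for this sketch; real SSS* additionally breaks ties left-first and needs a purge operation, which is awkward on a heap):

```python
import heapq

LIVE, SOLVED = "LIVE", "SOLVED"

# A descriptor (n, s, h) is stored as (-h, n, s): Python's heapq is a
# min-heap, so negating h makes heappop return the maximum-merit entry.
open_list = []
heapq.heappush(open_list, (-10, "a", LIVE))
heapq.heappush(open_list, (-7,  "b", LIVE))
heapq.heappush(open_list, (-12, "c", SOLVED))

neg_h, n, s = heapq.heappop(open_list)   # descriptor with maximum h
print(n, s, -neg_h)                      # -> c SOLVED 12
```

Note that deleting all descendants of a purged node is hard to do efficiently on such a heap, which is one source of the overhead criticized later in the article.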

Pseudo C-Code

int SSS* (node n, int bound)
{
   push (n, LIVE, bound);
   while ( true ) {
      pop (node);                                /* node with maximum h value */
      switch ( node.status ) {
      case LIVE:
         if (node == LEAF)
            push (node, SOLVED, min(eval(node), h));
         if (node == MIN_NODE)
            push (node.1, LIVE, h);              /* expand leftmost son only */
         if (node == MAX_NODE)
            for (j = w; j; j--)
               push (node.j, LIVE, h);           /* expand all sons */
         break;
      case SOLVED:
         if (node == ROOT_NODE)
            return h;                            /* minimax value found */
         if (node == MIN_NODE) {                 /* solves its MAX parent ... */
            purge (parent(node));                /* ... whose descendants are obsolete */
            push (parent(node), SOLVED, h);
         }
         if (node == MAX_NODE) {                 /* its MIN parent needs all sons */
            if (node has an unexamined brother)
               push (brother(node), LIVE, h);
            else
               push (parent(node), SOLVED, h);
         }
         break;
      }
   }
}
SSS* is too complex and too slow!
  • In each step, the node with the maximum h-value is removed from OPEN.
  • Whenever an interior MAX-node gets SOLVED, all direct and indirect descendants must be purged from OPEN

These two steps alone take 90% of the CPU time!
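To make the control flow concrete, here is a small runnable sketch of SSS* in Python. The tree representation (nested lists, uniform depth, MAX to move at the root) and the index-path node identifiers are implementation choices for this sketch, not part of Stockman's formulation; OPEN is a plain list scanned for the maximum-h descriptor, with ties resolved leftmost:

```python
from math import inf

def sss_star(tree, bound=inf):
    """SSS* on a uniform-depth game tree given as nested lists (MAX to move
    at the root).  Returns the minimax value of the root."""
    LIVE, SOLVED = 0, 1

    def sub(path):                                # subtree addressed by an index path
        t = tree
        for i in path:
            t = t[i]
        return t

    def max_to_move(path):                        # root (empty path) is a MAX node
        return len(path) % 2 == 0

    open_list = [(bound, (), LIVE)]               # descriptors (h, path, status)
    while True:
        # remove the maximum-h descriptor, leftmost path on ties
        best = min(open_list, key=lambda d: (-d[0], d[1]))
        open_list.remove(best)
        h, path, status = best
        node = sub(path)

        if status == LIVE:
            if not isinstance(node, list):        # leaf: solve, tightening the bound
                open_list.append((min(node, h), path, SOLVED))
            elif max_to_move(path):               # MAX node: expand all sons
                open_list += [(h, path + (j,), LIVE) for j in range(len(node))]
            else:                                 # MIN node: expand leftmost son only
                open_list.append((h, path + (0,), LIVE))
        else:                                     # SOLVED
            if not path:
                return h                          # root solved: h is the minimax value
            parent = path[:-1]
            if max_to_move(parent):               # one solved son solves a MAX parent:
                open_list = [d for d in open_list # purge the parent's descendants
                             if d[1][:len(parent)] != parent]
                open_list.append((h, parent, SOLVED))
            elif path[-1] + 1 < len(sub(parent)): # MIN parent: next unexamined brother
                open_list.append((h, parent + (path[-1] + 1,), LIVE))
            else:                                 # last son solved: MIN parent solved
                open_list.append((h, parent, SOLVED))

def minimax(t, maximizing=True):                  # reference implementation for checking
    if not isinstance(t, list):
        return t
    vals = [minimax(c, not maximizing) for c in t]
    return max(vals) if maximizing else min(vals)

tree = [[4, 8], [5, 2]]
print(sss_star(tree), minimax(tree))              # -> 4 4
```

The linear scan for the best descriptor and the purge by list comprehension are exactly the two operations the lecture identifies as the bottleneck.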

Aske Plaat et al. continue:

SSS*, as formulated by Stockman, has several problems. First, it takes considerable effort to understand how the algorithm works, and still more to understand its relation to Alpha-Beta. Second, SSS* maintains a data structure known as the OPEN list, similar to that found in single-agent search algorithms like A*. The size of this list grows exponentially with the depth of the search tree. This has led many authors to conclude that SSS* is effectively disqualified from being useful for real applications like game-playing programs. Third, the OPEN list must be kept in sorted order. Insert and (in particular) delete/purge operations on the OPEN list can dominate the execution time of any program using SSS*. Despite the promise of expanding fewer nodes, the disadvantages of SSS* have proven a significant deterrent in practice.

Quote by Judea Pearl 1984 [9] :

The meager improvement in the pruning power of SSS* is more than offset by the increased storage space and bookkeeping (e.g. sorting OPEN) that it requires. One can safely speculate therefore that alphabeta will continue to monopolize the practice of computerized game playing.


Dual* is the dual counterpart of SSS* by Tony Marsland et al. [10]. The dual version of SSS* can be created by inverting SSS*'s operations: sort the OPEN list in ascending instead of descending order, swap the max and min operations, and start at -oo instead of +oo. Dual* thus refines lower bounds on the minimax value rather than upper bounds.

RecSSS* and RecDual*

The development of recursive variants was driven by the need for a better understanding of SSS*'s node expansion process and by the demand for more efficient implementations. Two recursive variants were proposed in 1990: RecSSS* [11] by Subir Bhattacharya and Amitava Bagchi, and SSS-2 [12] by Wim Pijls and Arie de Bruin. In 1993, Alexander Reinefeld improved RecSSS*, making it both faster and more space efficient by using an OPEN array rather than dynamic list structures [13].

Pseudo C-Code

based on Alexander Reinefeld's Pascal pseudo code:

int RecSSS* (nodeType n)
{
   if (n is leaf) {
      s(n) = SOLVED;
      return min (evaluate(n), h(n));                            /* evaluate leaf */
   }
   if ( s(n) == UNEXPANDED ) {                                  /* first descent? */
      s(n) = LIVE;                                                    /* expand n */
      for (i = 1 to width)
         if ( n.i is leaf )
            insert (n.i, UNEXPANDED, h(n));        /* insert sons (= MIN leaves) of n */
         else
            insert (n.i.1, UNEXPANDED, h(n));          /* insert left grandsons of n */
   }
   g = highest h-valued grandson (or son) of n in OPEN;
   while ( h(g) == h(n) && s(g) != SOLVED ) {
      h(g) = RecSSS* (g);                                  /* get new upper bound */
      if ( s(g) == SOLVED && g has a right brother )
         replace g by (brother(g), UNEXPANDED, h(g));        /* next brother of g */
      g = highest h-valued grandson (or son) of n in OPEN;     /* resolve ties in */
   }                                                    /* lexicographical order */
   if ( s(g) == SOLVED )
      s(n) = SOLVED;
   return h(g);
}

int main()
{
   insert (root, UNEXPANDED, +oo);
   do
      h = RecSSS* (root);
   while ( s(root) != SOLVED );
}

SSS* and Dual* as MT

Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin proved with their Memory Test framework that both SSS* and Dual* can be reformulated as a sequence of Alpha-Beta null window calls with a Transposition Table [14]:

Pseudo C-Code

int MT-SSS*( n )
   g := +oo;
   do {
      G := g;
      g := Alpha-Beta(n, G-1, G );
   } while (g != G);
   return g;
int MT-DUAL*(n)
   g := -oo;
   do {
      G := g;
      g := Alpha-Beta(n, G, G+1 );
   } while (g != G);
   return g;
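Assuming integer-valued leaves and a large finite stand-in for oo, the two drivers can be sketched in runnable Python on top of a fail-soft Alpha-Beta whose transposition table stores per-node (lower, upper) bounds. The nested-list tree and the path-keyed table are simplifications for this sketch, not Plaat et al.'s implementation:

```python
from math import inf

def alpha_beta(tree, alpha, beta, tt, path=(), maximizing=True):
    """Fail-soft alpha-beta on a nested-list tree; tt caches (lower, upper)
    bounds per node so later null-window passes can reuse earlier work."""
    if not isinstance(tree, list):                 # leaf
        return tree
    lo, hi = tt.get(path, (-inf, inf))
    if lo >= beta:                                 # stored lower bound cuts off
        return lo
    if hi <= alpha:                                # stored upper bound cuts off
        return hi
    if maximizing:
        g, a = -inf, alpha
        for i, child in enumerate(tree):
            if g >= beta:
                break
            g = max(g, alpha_beta(child, a, beta, tt, path + (i,), False))
            a = max(a, g)
    else:
        g, b = inf, beta
        for i, child in enumerate(tree):
            if g <= alpha:
                break
            g = min(g, alpha_beta(child, alpha, b, tt, path + (i,), True))
            b = min(b, g)
    if g <= alpha:                                 # fail low: g is an upper bound
        tt[path] = (lo, g)
    elif g >= beta:                                # fail high: g is a lower bound
        tt[path] = (g, hi)
    else:                                          # exact value
        tt[path] = (g, g)
    return g

BIG = 10**9                                        # stands in for +oo (integer leaves)

def mt_sss(tree):                                  # refine an upper bound downward
    g, tt = BIG, {}
    while True:
        G = g
        g = alpha_beta(tree, G - 1, G, tt)         # null window just below the bound
        if g == G:
            return g

def mt_dual(tree):                                 # refine a lower bound upward
    g, tt = -BIG, {}
    while True:
        G = g
        g = alpha_beta(tree, G, G + 1, tt)         # null window just above the bound
        if g == G:
            return g

tree = [[4, 8], [5, 2]]
print(mt_sss(tree), mt_dual(tree))                 # -> 4 4
```

Each pass either tightens the bound or confirms it, so the loop terminates at the minimax value; the transposition table is what keeps the repeated passes from re-searching the whole tree.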

At the 8th Advances in Computer Chess conference 1996, SSS* was finally declared "dead" by Wim Pijls and Arie de Bruin [15] [16] .

References


  1. M. C. Escher, Stars, 1948, Picture gallery "Back in Holland 1941 - 1954" from The Official M.C. Escher Website
  2. George Stockman, Laveen Kanal (1983). Problem Reduction Representation for the Linguistic Analysis of Waveforms. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 5, No 3
  3. George Stockman (1979). A Minimax Algorithm Better than Alpha-Beta? Artificial Intelligence, Vol. 12, No. 2
  4. Igor Roizen, Judea Pearl (1983). A Minimax Algorithm Better than Alpha-Beta? Yes and No. Artificial Intelligence, Vol. 21, pp. 199-230. ISSN 0004-3702
  5. Toshihide Ibaraki (1986). Generalizations of Alpha-Beta and SSS* Search Procedures. Artificial Intelligence, Vol. 29, pp. 73-117
  6. Burkhard Monien, Oliver Vornberger (1988). Parallel Alpha-Beta versus Parallel SSS*. Proc. of the IFIP WG 10.3 Working Conference on Distributed Processing, North Holland
  7. Hans-Joachim Kraas (1990). Zur Parallelisierung des SSS*-Algorithmus. Ph.D thesis, TU Braunschweig (German)
  8. Alexander Reinefeld (2005). Die Entwicklung der Spielprogrammierung: Von John von Neumann bis zu den hochparallelen Schachmaschinen. slides as pdf, Themen der Informatik im historischen Kontext Ringvorlesung an der HU Berlin, 02.06.2005 (English paper, German title)
  9. Judea Pearl (1984). Heuristics: Intelligent Search Strategies for Computer Problem Solving. Addison-Wesley Publishers Co., Reading, MA. ISBN 0-201-05594-5.
  10. Tony Marsland, Alexander Reinefeld, Jonathan Schaeffer (1987). Low Overhead Alternatives to SSS*. Artificial Intelligence, Vol. 31, No. 2, pp. 185-199. ISSN 0004-3702.
  11. Subir Bhattacharya, Amitava Bagchi (1990). Unified Recursive Schemes for Search in Game Trees. Technical Report WPS-144, Indian Institute of Management, Calcutta
  12. Wim Pijls, Arie de Bruin (1990). Another View on the SSS* Algorithm. International Symposium SIGAL '90
  13. Alexander Reinefeld (1994). A Minimax Algorithm Faster than Alpha-Beta. Advances in Computer Chess 7
  14. Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1994, 2014). SSS* = Alpha-Beta + TT. TR-CS-94-17, University of Alberta, arXiv:1404.1517
  15. Arie de Bruin, Wim Pijls (1997). SSS†. Advances in Computer Chess 8
  16. Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1996). Best-First Fixed-Depth Minimax Algorithms. Artificial Intelligence, Vol. 87
  17. Re: Announcing lczero by Daniel Shawul, CCC, January 21, 2018 » Leela Chess Zero
