SSS* and Dual*
SSS* is a best-first state space search developed in 1977 by George C. Stockman for the linguistic analysis of waveforms ^{[2]}. In a later paper, Stockman (1979) ^{[3]} showed how to use this algorithm to determine the minimax value of game trees. SSS* and its counterpart Dual* are non-directional algorithms for searching AND/OR graphs in a best-first manner similar to A*. They expand multiple paths of the search graph, retain global information about the search space, and search fewer nodes than Alpha-Beta in fixed-depth minimax tree search.
The algorithm was examined and improved by various researchers: Igor Roizen and Judea Pearl ^{[4]}, Toshihide Ibaraki ^{[5]}, Tony Marsland et al., Subir Bhattacharya and Amitava Bagchi, Alexander Reinefeld, Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin. In 1988, Burkhard Monien and Oliver Vornberger compared parallel Alpha-Beta with parallel SSS* ^{[6]}, and in 1990, Hans-Joachim Kraas wrote his Ph.D. thesis on how to parallelize SSS* ^{[7]}.
However, it turned out that the saved nodes did not pay for the algorithmic overhead. Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin pointed out that SSS* can be reformulated as a sequence of Alpha-Beta null-window calls with a Transposition Table.
SSS*
Aske Plaat et al. about SSS*:
In 1979 Stockman introduced SSS*, which looked like a radically different approach from Alpha-Beta for searching fixed-depth minimax trees. It builds a tree in a so-called best-first fashion by visiting the most promising nodes first. Alpha-Beta, in contrast, uses a depth-first, left-to-right traversal of the tree. Intuitively, it would seem that a best-first strategy should prevail over a rigidly ordered depth-first one. Stockman proved that SSS* dominated Alpha-Beta; it would never evaluate more leaf nodes than Alpha-Beta. Numerous simulations have shown that on average SSS* evaluates considerably fewer leaf nodes. Why, then, has the algorithm been shunned by practitioners?
Alexander Reinefeld gives a good answer, along with an explanation of SSS*, in his lecture ^{[8]}:
SSS* maintains an OPEN list with descriptors of the active nodes. Descriptors are sorted in decreasing order of their merit (h-values). A descriptor (n, s, h) consists of

 a node identifier n
 a status s, which is either
  LIVE: n is still unexpanded and h is an upper bound on the true value, or
  SOLVED: h is the true value
 a merit h
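The descriptor triple and a descending-sorted OPEN list can be sketched in C. This is an illustrative sketch, not code from Stockman's paper: the names `Descriptor`, `open_insert` and `open_pop`, the fixed array capacity, and the array representation are all assumptions made here; historical implementations used dynamic list structures.

```c
#include <string.h>

/* Hypothetical sketch of an SSS* descriptor (n, s, h) and an OPEN list
   kept sorted in decreasing order of merit. */

typedef enum { LIVE, SOLVED } Status;

typedef struct {
    int    node;    /* node identifier n */
    Status status;  /* s: LIVE or SOLVED */
    int    merit;   /* h: upper bound (LIVE) or true value (SOLVED) */
} Descriptor;

#define MAX_OPEN 256
static Descriptor open_list[MAX_OPEN];
static int open_size = 0;

/* insert d, keeping OPEN sorted by decreasing merit */
static void open_insert(Descriptor d) {
    int i = open_size++;
    while (i > 0 && open_list[i - 1].merit < d.merit) {
        open_list[i] = open_list[i - 1];   /* shift smaller merits right */
        --i;
    }
    open_list[i] = d;
}

/* remove and return the descriptor with the highest merit */
static Descriptor open_pop(void) {
    Descriptor d = open_list[0];
    memmove(open_list, open_list + 1, --open_size * sizeof(Descriptor));
    return d;
}
```

Note that every `open_insert` is linear in the list length; this sorted maintenance is exactly the bookkeeping cost criticized below.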
SSS*'s two search phases:

 Node Expansion Phase: top-down expansion of a MIN strategy
 Solution Phase: bottom-up search for the best MAX strategy
Pseudo C-Code
int SSS* (node n; int bound) {
    push (n, LIVE, bound);
    while ( true ) {
        pop (node);                                 /* node with maximum h */
        switch ( node.status ) {
        case LIVE:
            if (node == LEAF)
                insert (node, SOLVED, min(eval(node), h));
            if (node == MIN_NODE)
                push (node.1, LIVE, h);             /* expand first son only */
            if (node == MAX_NODE)
                for (j = w; j > 0; j--)
                    push (node.j, LIVE, h);         /* expand all sons */
            break;
        case SOLVED:
            if (node == ROOT_NODE)
                return (h);
            if (node == MIN_NODE) {
                purge (parent(node));               /* purge all descendants of parent */
                push (parent(node), SOLVED, h);
            }
            if (node == MAX_NODE) {
                if (node has an unexamined brother)
                    push (brother(node), LIVE, h);
                else
                    push (parent(node), SOLVED, h);
            }
            break;
        }
    }
}
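For a concrete picture of the control flow above, the sketch below is a self-contained toy SSS* in C for a fixed uniform depth-2 tree (MAX root, three MIN nodes, three leaves each). The node encoding, the linear-scan pop, the tie-breaking, and the hard-coded leaf values are illustrative assumptions made here; only the case analysis mirrors the pseudocode.

```c
#include <limits.h>

/* Toy SSS* on a hard-coded depth-2 tree: MAX root, WIDTH MIN sons,
   WIDTH leaves each. Leaves sit on a MAX level, so they take the
   MAX_NODE role in the SOLVED case analysis. */

#define WIDTH 3
static const int leaves[WIDTH][WIDTH] = {   /* leaves[i][j]: j-th leaf under MIN node i */
    { 4, 8, 9 }, { 3, 7, 2 }, { 1, 6, 5 },  /* minimax value: max(4, 2, 1) = 4 */
};

enum { ROOT, MIN_NODE, LEAF };
enum { LIVE, SOLVED };

typedef struct { int type, i, j, status, h; } Desc;

static Desc open_list[64];
static int open_size;

static void push(int type, int i, int j, int status, int h) {
    Desc d = { type, i, j, status, h };
    open_list[open_size++] = d;
}

/* remove and return a descriptor with maximal merit h */
static Desc pop_max(void) {
    int best = 0;
    for (int k = 1; k < open_size; ++k)
        if (open_list[k].h > open_list[best].h) best = k;
    Desc d = open_list[best];
    open_list[best] = open_list[--open_size];   /* fill the hole */
    return d;
}

static int sss_star(void) {
    open_size = 0;
    push(ROOT, 0, 0, LIVE, INT_MAX);
    for (;;) {
        Desc d = pop_max();
        if (d.status == LIVE) {
            if (d.type == ROOT)                     /* MAX node: expand all sons */
                for (int i = WIDTH - 1; i >= 0; --i)
                    push(MIN_NODE, i, 0, LIVE, d.h);
            else if (d.type == MIN_NODE)            /* MIN node: expand first son */
                push(LEAF, d.i, 0, LIVE, d.h);
            else                                    /* leaf: solve with tightened bound */
                push(LEAF, d.i, d.j, SOLVED,
                     leaves[d.i][d.j] < d.h ? leaves[d.i][d.j] : d.h);
        } else {
            if (d.type == ROOT)
                return d.h;                         /* h is the minimax value */
            if (d.type == LEAF) {                   /* MAX role: try next brother */
                if (d.j + 1 < WIDTH)
                    push(LEAF, d.i, d.j + 1, LIVE, d.h);
                else
                    push(MIN_NODE, d.i, 0, SOLVED, d.h);
            } else {                                /* solved MIN son: root is solved */
                open_size = 0;                      /* purge all other descendants */
                push(ROOT, 0, 0, SOLVED, d.h);
            }
        }
    }
}
```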
SSS* is too complex and too slow!

 In each step, the node with the maximum h-value is removed from OPEN.
 Whenever an interior MAX node gets SOLVED, all direct and indirect descendants must be purged from OPEN.

These two steps alone take 90% of the CPU time!
Aske Plaat et al. continue:
SSS*, as formulated by Stockman, has several problems. First, it takes considerable effort to understand how the algorithm works, and still more to understand its relation to Alpha-Beta. Second, SSS* maintains a data structure known as the OPEN list, similar to that found in single-agent search algorithms like A*. The size of this list grows exponentially with the depth of the search tree. This has led many authors to conclude that SSS* is effectively disqualified from being useful for real applications like game-playing programs. Third, the OPEN list must be kept in sorted order. Insert and (in particular) delete/purge operations on the OPEN list can dominate the execution time of any program using SSS*. Despite the promise of expanding fewer nodes, the disadvantages of SSS* have proven a significant deterrent in practice.
Quote by Judea Pearl, 1984 ^{[9]}:
The meager improvement in the pruning power of SSS* is more than offset by the increased storage space and bookkeeping (e.g. sorting OPEN) that it requires. One can safely speculate therefore that alpha-beta will continue to monopolize the practice of computerized game playing.
Dual*
Dual* is the dual counterpart of SSS* by Tony Marsland et al. ^{[10]}. The dual version of SSS* is created by inverting SSS*'s operations: keep the OPEN list sorted in ascending instead of descending order, swap all max and min operations, and start at -oo instead of +oo, so that the h-values are lower bounds driven upward rather than upper bounds driven downward.
RecSSS* and RecDual*
The development of recursive variants was driven by the need for a better understanding of SSS*'s node expansion process and by the demand for more efficient implementations. Two recursive variants were proposed in 1990: RecSSS* ^{[11]} by Subir Bhattacharya and Amitava Bagchi, and SSS-2 ^{[12]} by Wim Pijls and Arie de Bruin. In 1993, Alexander Reinefeld improved RecSSS*, making it both faster and more space efficient by using an OPEN array rather than dynamic list structures ^{[13]}.
Pseudo C-Code
based on Alexander Reinefeld's Pascal pseudo code:
int RecSSS* (nodeType n) {
    if (n is leaf) {
        s(n) = SOLVED;
        return min (evaluate(n), h(n));               /* evaluate leaf */
    }
    if ( s(n) == UNEXPANDED ) {                       /* first descent? */
        s(n) = LIVE;                                  /* expand n */
        for (i = 1 to width)
            if ( n.i is leaf )
                insert (n.i, UNEXPANDED, h(n));       /* insert sons (= MIN leaves) of n */
            else
                insert (n.i.1, UNEXPANDED, h(n));     /* insert left grandsons of n */
    }
    g = highest h-valued grandson (or son) of n in OPEN;   /* resolve ties in lexicographical order */
    while ( h(g) == h(n) && s(g) != SOLVED ) {
        h(g) = RecSSS*(g);                            /* get new upper bound */
        if ( s(g) == SOLVED && g has a right brother )
            replace g by (brother(g), UNEXPANDED, h(g));   /* next brother of g */
        g = highest h-valued grandson (or son) of n in OPEN;
    }
    if ( s(g) == SOLVED )
        s(n) = SOLVED;
    return h(g);
}
int main() {
    insert (root, UNEXPANDED, +oo);
    do
        h = RecSSS*(root);
    while ( s(root) != SOLVED );
}
SSS* and Dual* as MT
Aske Plaat, Jonathan Schaeffer, Wim Pijls and Arie de Bruin proved with their Memory-enhanced Test (MT) framework that both SSS* and Dual* can be reformulated as a sequence of Alpha-Beta null-window calls with a Transposition Table ^{[14]}:
Pseudo C-Code
int MTSSS* (n) {
    g := +oo;
    do {
        G := g;
        g := AlphaBeta(n, G - 1, G);   /* null window just below the upper bound */
    } while (g != G);
    return g;
}
int MTDUAL* (n) {
    g := -oo;
    do {
        G := g;
        g := AlphaBeta(n, G, G + 1);   /* null window just above the lower bound */
    } while (g != G);
    return g;
}
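As a sanity check of the MT formulation, the sketch below runs both null-window driver loops over a plain fail-soft Alpha-Beta on a hard-coded depth-2 tree. The Transposition Table is omitted here for brevity (nodes are re-searched rather than cached, so only the node savings are lost, not correctness), and the tree shape and leaf values are made-up for the example; both drivers converge to the same minimax value.

```c
#include <limits.h>

/* Fixed depth-2 tree: MAX root over WIDTH MIN nodes over WIDTH leaves. */
#define WIDTH 3
static const int leaves[WIDTH][WIDTH] = {   /* minimax value: max(4, 2, 1) = 4 */
    { 4, 8, 9 }, { 3, 7, 2 }, { 1, 6, 5 },
};

/* fail-soft Alpha-Beta over the fixed tree; MIN level */
static int alpha_beta_min(int i, int alpha, int beta) {
    int best = INT_MAX;
    for (int j = 0; j < WIDTH; ++j) {
        int v = leaves[i][j];
        if (v < best) best = v;
        if (best < beta) beta = best;
        if (beta <= alpha) break;       /* alpha cutoff */
    }
    return best;
}

/* fail-soft Alpha-Beta; MAX root */
static int alpha_beta(int alpha, int beta) {
    int best = INT_MIN;
    for (int i = 0; i < WIDTH; ++i) {
        int v = alpha_beta_min(i, alpha, beta);
        if (v > best) best = v;
        if (best > alpha) alpha = best;
        if (alpha >= beta) break;       /* beta cutoff */
    }
    return best;
}

/* MT-SSS*: drive the upper bound g down with windows (g-1, g) */
static int mt_sss(void) {
    int g = INT_MAX, G;
    do {
        G = g;
        g = alpha_beta(G - 1, G);
    } while (g != G);
    return g;
}

/* MT-DUAL*: drive the lower bound g up with windows (g, g+1) */
static int mt_dual(void) {
    int g = INT_MIN, G;
    do {
        G = g;
        g = alpha_beta(G, G + 1);
    } while (g != G);
    return g;
}
```

With a real Transposition Table, each re-search would reuse the bounds established by earlier passes, which is what makes MT-SSS* competitive in practice.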
At the 8th Advances in Computer Chess conference in 1996, SSS* was finally declared "dead" by Wim Pijls and Arie de Bruin ^{[15]} ^{[16]}.
Publications
1979
 George Stockman (1979). A Minimax Algorithm Better than Alpha-Beta? Artificial Intelligence, Vol. 12, No. 2
1980 ...
 Igor Roizen, Judea Pearl (1983). A Minimax Algorithm Better than Alpha-Beta? Yes and No. Artificial Intelligence, Vol. 21
 Murray Campbell, Tony Marsland (1983). A Comparison of Minimax Tree Search Algorithms. Artificial Intelligence, Vol. 20, No. 4, pp. 347-367. ISSN 0004-3702, pdf
 Nanda Srimani (1985). A New Algorithm (PS*) for Searching Game Trees. Master's thesis, University of Alberta
 Daniel B. Leifker, Laveen N. Kanal (1985). A Hybrid SSS*/AlphaBeta Algorithm for Parallel Search of Game Trees. IJCAI'85
 Toshihide Ibaraki (1986). Generalization of Alpha-Beta and SSS* Search Procedures. Artificial Intelligence, Vol. 29
 Tony Marsland, Nanda Srimani (1986). Phased State Search. Fall Joint Computer Conference, pdf
 Tony Marsland, Alexander Reinefeld, Jonathan Schaeffer (1987). Low Overhead Alternatives to SSS*. Artificial Intelligence, Vol. 31, No. 2, pp. 185-199. ISSN 0004-3702.
 Burkhard Monien, Oliver Vornberger (1988). Parallel Alpha-Beta versus Parallel SSS*. Proc. of the IFIP WG 10.3 Working Conference on Distributed Processing, North Holland
 Yoshiroh Katoh, Toshihide Ibaraki (1988). Game Solving Procedure SSS* is Unsurpassed. Systems and Computers in Japan, Vol. 19, No. 7
1990 ...
 Subir Bhattacharya, Amitava Bagchi (1990). Unified Recursive Schemes for Search in Game Trees. Technical Report WPS-144, Indian Institute of Management, Calcutta
 Hans-Joachim Kraas (1990). Zur Parallelisierung des SSS*-Algorithmus. Ph.D. thesis, TU Braunschweig (German)
 Wim Pijls, Arie de Bruin (1990). Another View on the SSS* Algorithm. International Symposium SIGAL '90
 Claude G. Diderich (1992). Evaluation des performances de l'algorithme SSS* avec phases de synchronisation sur une machine parallèle à mémoires distribuées. Technical report LITH-99, Swiss Federal Institute of Technology, Computer Science Theory Laboratory, Lausanne, Switzerland (French)
 Wim Pijls, Arie de Bruin (1993). SSS*-like Algorithms in Constrained Memory. ICCA Journal, Vol. 16, No. 1
 Alexander Reinefeld (1994). A Minimax Algorithm Faster than Alpha-Beta. Advances in Computer Chess 7
 Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1994, 2014). SSS* = Alpha-Beta + TT. Technical Report TR-CS-94-17, University of Alberta, arXiv:1404.1517
 Alexander Reinefeld, Peter Ridinger (1994). Time-Efficient State Space Search. Artificial Intelligence, Vol. 71, No. 2, CiteSeerX
 Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1996). An Algorithm Faster than NegaScout and SSS* in Practice. pdf from CiteSeerX » MTD(f)
 Arie de Bruin, Wim Pijls (1997). SSS†. Advances in Computer Chess 8
2000 ...
2010 ...
 Bojun Huang (2015). Pruning Game Tree by Rollouts. AAAI » MT-SSS*, MCTS ^{[17]}
External Links
 SSS* from Wikipedia
 State space search from Wikipedia
References
 ↑ M. C. Escher, Stars, 1948, Picture gallery "Back in Holland 1941  1954" from The Official M.C. Escher Website
 ↑ George Stockman, Laveen Kanal (1983). Problem Reduction Representation for the Linguistic Analysis of Waveforms. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 5, No. 3
 ↑ George Stockman (1979). A Minimax Algorithm Better than Alpha-Beta? Artificial Intelligence, Vol. 12, No. 2
 ↑ Igor Roizen, Judea Pearl (1983). A Minimax Algorithm Better than Alpha-Beta? Yes and No. Artificial Intelligence, Vol. 21
 ↑ Toshihide Ibaraki (1986). Generalization of Alpha-Beta and SSS* Search Procedures. Artificial Intelligence, Vol. 29
 ↑ Burkhard Monien, Oliver Vornberger (1988). Parallel Alpha-Beta versus Parallel SSS*. Proc. of the IFIP WG 10.3 Working Conference on Distributed Processing, North Holland
 ↑ Hans-Joachim Kraas (1990). Zur Parallelisierung des SSS*-Algorithmus. Ph.D. thesis, TU Braunschweig (German)
 ↑ Alexander Reinefeld (2005). Die Entwicklung der Spielprogrammierung: Von John von Neumann bis zu den hochparallelen Schachmaschinen. slides as pdf, Themen der Informatik im historischen Kontext Ringvorlesung an der HU Berlin, 02.06.2005 (English paper, German title)
 ↑ Judea Pearl (1984). Heuristics: Intelligent Search Strategies for Computer Problem Solving. Addison-Wesley Publishers Co., Reading, MA. ISBN 0-201-05594-5.
 ↑ Tony Marsland, Alexander Reinefeld, Jonathan Schaeffer (1987). Low Overhead Alternatives to SSS*. Artificial Intelligence, Vol. 31, No. 2, pp. 185-199. ISSN 0004-3702.
 ↑ Subir Bhattacharya, Amitava Bagchi (1990). Unified Recursive Schemes for Search in Game Trees. Technical Report WPS-144, Indian Institute of Management, Calcutta
 ↑ Wim Pijls, Arie de Bruin (1990). Another View on the SSS* Algorithm. International Symposium SIGAL '90
 ↑ Alexander Reinefeld (1994). A Minimax Algorithm Faster than Alpha-Beta. Advances in Computer Chess 7
 ↑ Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1994, 2014). SSS* = Alpha-Beta + TT. Technical Report TR-CS-94-17, University of Alberta, arXiv:1404.1517
 ↑ Arie de Bruin, Wim Pijls (1997). SSS†. Advances in Computer Chess 8
 ↑ Aske Plaat, Jonathan Schaeffer, Wim Pijls, Arie de Bruin (1996). Best-First Fixed-Depth Minimax Algorithms. Artificial Intelligence, Vol. 87
 ↑ Re: Announcing lczero by Daniel Shawul, CCC, January 21, 2018 » Leela Chess Zero