NNUE

3 bytes added, 22:26, 27 July 2020
The neural network consists of four layers, W1 through W4. The input layer W1 is heavily overparametrized, feeding in the [[Board Representation|board representation]] for various king and pawn configurations.
The efficiency of the net is due to [[Incremental Updates|incremental update]] of W1 in [[Make Move|make]] and [[Unmake Move|unmake move]],
where only a fraction of its neurons need to be recalculated. The remaining three layers with 256x2-32-32 channels are computationally less expensive, best calculated using appropriate [[SIMD and SWAR Techniques|SIMD instructions]] like [[AVX2]] on [[x86-64]], or if available, [[AVX-512]].
[[FILE:NNUE.jpg|none|border|text-bottom]]
