Latest revision as of 20:44, 13 September 2019
Ilya Gennadyevich Loshchilov,
a Russian computer scientist working in the fields of evolutionary algorithms (EA), optimization, and machine learning.
He worked with Marc Schoenauer and Michèle Sebag at Paris-Sud 11 University and INRIA within their TAO Project Team [2] on multi-objective optimization,
adaptive coordinate descent and CMA-ES, where he defended his Ph.D. with the title Surrogate-Assisted Evolutionary Algorithms in 2013 [3].
In his thesis, he acknowledged Leonid Ivanovich Babak [4] from Tomsk State University as his first scientific advisor, who showed him the power of evolution, and his chess teacher Vladimir Andreevich Zhukov, who showed him the pleasure of thinking. As a postdoc at University of Freiburg, he continued his research with Frank Hutter, combining evolutionary algorithms with deep learning techniques.
SGDR
Loshchilov and Hutter's stochastic gradient descent with warm restarts (SGDR), introduced in their 2016 paper [5], is mentioned as a method of learning rate scheduling used to train Leela Chess' third-party network Leelenstein [6], which is combined with Allie to form the AllieStein chess playing entity. SGDR anneals the learning rate from an initial value down toward a minimum following a cosine curve, then periodically resets ("warm restarts") it to the initial value, with each successive annealing period typically lengthened by a constant factor.
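The schedule can be sketched in a few lines. This is a minimal illustration of the cosine-annealing formula from the SGDR paper, eta = eta_min + ½(eta_max − eta_min)(1 + cos(π·T_cur/T_i)); the function name and default hyperparameters here are illustrative, not from the paper:

```python
import math

def sgdr_lr(epoch, eta_min=0.0, eta_max=0.1, t0=10, t_mult=2):
    """Learning rate at a given epoch under SGDR.

    The run is divided into periods; the first lasts t0 epochs and each
    subsequent period is t_mult times longer. Within a period the rate
    follows a half cosine from eta_max down to eta_min, then "warm
    restarts" back to eta_max at the start of the next period.
    """
    t_i, t_cur = t0, epoch
    # Locate the current period and the position within it.
    while t_cur >= t_i:
        t_cur -= t_i
        t_i *= t_mult
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i))
```

With the defaults above, the rate starts at 0.1, decays toward 0.0 over the first 10 epochs, jumps back to 0.1 at epoch 10, and then decays again over a 20-epoch period, and so on.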
Selected Publications
2010 ...
- Ilya Loshchilov, Marc Schoenauer, Michèle Sebag (2010). Dominance-Based Pareto-Surrogate for Multi-Objective Optimization. SEAL 2010, pdf
- Ilya Loshchilov, Marc Schoenauer, Michèle Sebag (2010). A mono surrogate for multiobjective optimization. GECCO 2010, pdf
- Ilya Loshchilov, Marc Schoenauer, Michèle Sebag (2011). Adaptive coordinate descent. GECCO 2011, pdf [9]
- Ilya Loshchilov, Marc Schoenauer, Michèle Sebag (2012). Self-Adaptive Surrogate-Assisted Covariance Matrix Adaptation Evolution Strategy. arXiv:1204.2356
- Ilya Loshchilov, Marc Schoenauer, Michèle Sebag (2012). Alternative Restart Strategies for CMA-ES. arXiv:1207.0206
- Ilya Loshchilov, Marc Schoenauer, Michèle Sebag (2013). KL-based Control of the Learning Schedule for Surrogate Black-Box Optimization. arXiv:1308.2655
- Ilya Loshchilov (2013). Surrogate-Assisted Evolutionary Algorithms. Ph.D. thesis, Paris-Sud 11 University, advisors Marc Schoenauer and Michèle Sebag
- Ilya Loshchilov (2013). CMA-ES with Restarts for Solving CEC 2013 Benchmark Problems. CEC 2013, pdf [10]
- Ilya Loshchilov (2014). A Computationally Efficient Limited Memory CMA-ES for Large Scale Optimization. arXiv:1404.5520
2015 ...
- Ilya Loshchilov (2015). LM-CMA: an Alternative to L-BFGS for Large Scale Black-box Optimization. arXiv:1511.00221 [11] [12]
- Ilya Loshchilov, Frank Hutter (2015). Online Batch Selection for Faster Training of Neural Networks. arXiv:1511.06343
- Ilya Loshchilov, Frank Hutter (2016). CMA-ES for Hyperparameter Optimization of Deep Neural Networks. arXiv:1604.07269
- Ilya Loshchilov, Frank Hutter (2016). SGDR: Stochastic Gradient Descent with Warm Restarts. arXiv:1608.03983
- Ilya Loshchilov, Tobias Glasmachers (2016). Anytime Bi-Objective Optimization with a Hybrid Multi-Objective CMA-ES (HMO-CMA-ES). GECCO 2016
- Ilya Loshchilov, Frank Hutter (2017). Decoupled Weight Decay Regularization. arXiv:1711.05101
- Ilya Loshchilov, Tobias Glasmachers, Hans-Georg Beyer (2017). Limited-Memory Matrix Adaptation for Large Scale Black-box Optimization. arXiv:1705.06693
- Patryk Chrabaszcz, Ilya Loshchilov, Frank Hutter (2018). Back to Basics: Benchmarking Canonical Evolution Strategies for Playing Atari. arXiv:1802.08842 [13] [14]
- Ilya Loshchilov, Tobias Glasmachers, Hans-Georg Beyer (2019). Large Scale Black-Box Optimization by Limited-Memory Matrix Adaptation. IEEE Transactions on Evolutionary Computation, Vol. 23, No. 2
External Links
- GitHub - loshchil/SGDR
- GitHub - loshchil/AdamW-and-SGDW: Decoupled Weight Decay Regularization (ICLR 2019)
References
- ↑ Ilya Loshchilov
- ↑ TikiWiki | People
- ↑ Ilya Loshchilov (2013). Surrogate-Assisted Evolutionary Algorithms. Ph.D. thesis, Paris-Sud 11 University, advisors Marc Schoenauer and Michèle Sebag
- ↑ Бабак Леонид Иванович | Персона | 4science
- ↑ Ilya Loshchilov, Frank Hutter (2016). SGDR: Stochastic Gradient Descent with Warm Restarts. arXiv:1608.03983
- ↑ Jjosh is creating Leelenstein | Patreon
- ↑ Ilya Loshchilov - Google Scholar Citations
- ↑ dblp: Ilya Loshchilov
- ↑ Adaptive coordinate descent from Wikipedia
- ↑ cec2013 - Benchmark functions for the CEC 2013 Special Session on Real-Parameter Optimization - RForge.net
- ↑ Limited-memory BFGS (L-BFGS) from Wikipedia
- ↑ LM-CMA source code
- ↑ AI finds novel way to beat classic Q*bert Atari video game, BBC News, March 01, 2018
- ↑ Q*bert from Wikipedia