'''[[Main Page|Home]] * [[People]] * Ilya Loshchilov'''

[[FILE:IlyaLoshchilov.jpg|border|right|thumb|link=http://loshchilov.com/| Ilya Loshchilov <ref>[http://loshchilov.com/ Ilya Loshchilov]</ref> ]]

'''Ilya Gennadyevich Loshchilov''',<br/>
a Russian computer scientist in the field of [[Genetic Programming#EvolutionaryAlgorithms|evolutionary algorithms]] (EA), [https://en.wikipedia.org/wiki/Mathematical_optimization optimization] and [[Learning|machine learning]].
He worked with [[Marc Schoenauer]] and [[Michèle Sebag]] at [[University of Paris#11|Paris-Sud 11 University]] and [https://en.wikipedia.org/wiki/French_Institute_for_Research_in_Computer_Science_and_Automation INRIA] within their ''TAO Project Team'' <ref>[https://tao.lri.fr/tiki-index.php?page=People TikiWiki | People]</ref> on [https://en.wikipedia.org/wiki/Multi-objective_optimization multi-objective optimization],
[https://en.wikipedia.org/wiki/Adaptive_coordinate_descent adaptive coordinate descent] and [https://en.wikipedia.org/wiki/CMA-ES CMA-ES], where he defended his Ph.D. with the title ''Surrogate-Assisted Evolutionary Algorithms'' in 2013 <ref>[[Ilya Loshchilov]] ('''2013'''). ''[http://loshchilov.com/phd.html Surrogate-Assisted Evolutionary Algorithms]''. Ph.D. thesis, [[University of Paris#11|Paris-Sud 11 University]], advisors [[Marc Schoenauer]] and [[Michèle Sebag]]</ref>.
As a postdoc at the [https://en.wikipedia.org/wiki/University_of_Freiburg University of Freiburg], he continued his research with [[Frank Hutter]], combining evolutionary algorithms with [[Deep Learning|deep learning]] techniques.

=SGDR=
Loshchilov and Hutter's ''[https://en.wikipedia.org/wiki/Stochastic_gradient_descent stochastic gradient descent] with warm restarts'' ('''SGDR'''), introduced in their 2016 paper <ref>[[Ilya Loshchilov]], [[Frank Hutter]] ('''2016'''). ''SGDR: Stochastic Gradient Descent with Warm Restarts''. [https://arxiv.org/abs/1608.03983 arXiv:1608.03983]</ref>, is mentioned as a method of [https://en.wikipedia.org/wiki/Learning_rate learning rate] scheduling used to train [[Leela Chess Zero|Leela Chess Zero's]] third party network '''Leelenstein''' <ref>[https://www.patreon.com/jjosh Jjosh is creating Leelenstein | Patreon]</ref>, which is combined with [[Allie]] for the [[Allie#AllieStein|AllieStein]] chess playing entity.
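The core of SGDR is a cosine-annealed learning rate that periodically "restarts": within each cycle the rate decays from a maximum to a minimum following a half cosine, then jumps back to the maximum, with each cycle optionally lasting longer than the last. A minimal sketch in Python (the function name <code>sgdr_lr</code> and the default hyperparameter values are illustrative; the cosine formula and the restart rule with cycle-length multiplier <code>T_mult</code> follow the paper):

```python
import math

def sgdr_lr(step, eta_min=0.0, eta_max=0.1, T_0=10, T_mult=2):
    """Learning rate at a given step under SGDR (cosine annealing
    with warm restarts).

    The rate decays from eta_max to eta_min over T_i steps via
    eta = eta_min + 0.5 * (eta_max - eta_min) * (1 + cos(pi * t / T_i)),
    then restarts with the next cycle length T_{i+1} = T_i * T_mult.
    """
    T_i, t = T_0, step
    while t >= T_i:        # locate the cycle this step falls in
        t -= T_i
        T_i *= T_mult
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t / T_i))
```

With the defaults above, the rate starts at <code>eta_max</code>, reaches the midpoint halfway through a cycle, and snaps back to <code>eta_max</code> at step 10 (the first warm restart), after which the cycle length doubles to 20.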

=Selected Publications=
<ref>[https://scholar.google.fr/citations?user=GladWQwAAAAJ&hl=en Ilya Loshchilov - Google Scholar Citations]</ref> <ref>[https://dblp.org/pers/hd/l/Loshchilov:Ilya dblp: Ilya Loshchilov]</ref>
==2010 ...==
* [[Ilya Loshchilov]], [[Marc Schoenauer]], [[Michèle Sebag]] ('''2010'''). ''[https://link.springer.com/chapter/10.1007/978-3-642-17298-4_24 Dominance-Based Pareto-Surrogate for Multi-Objective Optimization]''. [https://dblp.org/db/conf/seal/seal2010.html SEAL 2010], [https://hal.inria.fr/inria-00522653/document pdf]
* [[Ilya Loshchilov]], [[Marc Schoenauer]], [[Michèle Sebag]] ('''2010'''). ''A mono surrogate for multiobjective optimization''. [https://dblp.org/db/conf/gecco/gecco2010.html GECCO 2010], [https://hal.inria.fr/inria-00483948/document pdf]
* [[Ilya Loshchilov]], [[Marc Schoenauer]], [[Michèle Sebag]] ('''2011'''). ''Adaptive Coordinate Descent''. [https://dblp.org/db/conf/gecco/gecco2011.html GECCO 2011], [http://www.loshchilov.com/publications/GECCO2011_AdaptiveCoordinateDescent.pdf pdf] <ref>[https://en.wikipedia.org/wiki/Adaptive_coordinate_descent Adaptive coordinate descent from Wikipedia]</ref>
* [[Ilya Loshchilov]], [[Marc Schoenauer]], [[Michèle Sebag]] ('''2012'''). ''Self-Adaptive Surrogate-Assisted Covariance Matrix Adaptation Evolution Strategy''. [https://arxiv.org/abs/1204.2356 arXiv:1204.2356]
* [[Ilya Loshchilov]], [[Marc Schoenauer]], [[Michèle Sebag]] ('''2012'''). ''Alternative Restart Strategies for CMA-ES''. [https://arxiv.org/abs/1207.0206 arXiv:1207.0206]
* [[Ilya Loshchilov]], [[Marc Schoenauer]], [[Michèle Sebag]] ('''2013'''). ''KL-based Control of the Learning Schedule for Surrogate Black-Box Optimization''. [https://arxiv.org/abs/1308.2655 arXiv:1308.2655]
* [[Ilya Loshchilov]] ('''2013'''). ''[http://loshchilov.com/phd.html Surrogate-Assisted Evolutionary Algorithms]''. Ph.D. thesis, [[University of Paris#11|Paris-Sud 11 University]], advisors [[Marc Schoenauer]] and [[Michèle Sebag]]
* [[Ilya Loshchilov]] ('''2013'''). ''CMA-ES with Restarts for Solving CEC 2013 Benchmark Problems''. [https://dblp.org/db/conf/cec/cec2013.html CEC 2013], [https://hal.inria.fr/hal-00823880/document pdf] <ref>[http://www.rforge.net/cec2013/ cec2013 - Benchmark functions for the CEC 2013 Special Session on Real-Parameter Optimization - RForge.net]</ref>
* [[Ilya Loshchilov]] ('''2014'''). ''A Computationally Efficient Limited Memory CMA-ES for Large Scale Optimization''. [https://arxiv.org/abs/1404.5520 arXiv:1404.5520]
==2015 ...==
* [[Ilya Loshchilov]] ('''2015'''). ''LM-CMA: an Alternative to L-BFGS for Large Scale Black-box Optimization''. [https://arxiv.org/abs/1511.00221 arXiv:1511.00221] <ref>[https://en.wikipedia.org/wiki/Limited-memory_BFGS L-BFGS from Wikipedia]</ref> <ref>[https://sites.google.com/site/ecjlmcma/ LM-CMA source code]</ref>
* [[Ilya Loshchilov]], [[Frank Hutter]] ('''2015'''). ''Online Batch Selection for Faster Training of Neural Networks''. [https://arxiv.org/abs/1511.06343 arXiv:1511.06343]
* [[Ilya Loshchilov]], [[Frank Hutter]] ('''2016'''). ''CMA-ES for Hyperparameter Optimization of Deep Neural Networks''. [https://arxiv.org/abs/1604.07269 arXiv:1604.07269]
* [[Ilya Loshchilov]], [[Frank Hutter]] ('''2016'''). ''SGDR: Stochastic Gradient Descent with Warm Restarts''. [https://arxiv.org/abs/1608.03983 arXiv:1608.03983]
* [[Ilya Loshchilov]], [[#TGlasmachers|Tobias Glasmachers]] ('''2016'''). ''[https://dl.acm.org/citation.cfm?id=2931698 Anytime Bi-Objective Optimization with a Hybrid Multi-Objective CMA-ES (HMO-CMA-ES)]''. [https://dblp.org/db/conf/gecco/gecco2016c.html GECCO 2016]
* [[Ilya Loshchilov]], [[Frank Hutter]] ('''2017'''). ''Decoupled Weight Decay Regularization''. [https://arxiv.org/abs/1711.05101 arXiv:1711.05101]
* [[Ilya Loshchilov]], [[#TGlasmachers|Tobias Glasmachers]], [https://de.wikipedia.org/wiki/Hans-Georg_Beyer_(Informatiker) Hans-Georg Beyer] ('''2017'''). ''Limited-Memory Matrix Adaptation for Large Scale Black-box Optimization''. [https://arxiv.org/abs/1705.06693 arXiv:1705.06693]
* [https://dblp.org/pers/hd/c/Chrabaszcz:Patryk Patryk Chrabaszcz], [[Ilya Loshchilov]], [[Frank Hutter]] ('''2018'''). ''Back to Basics: Benchmarking Canonical Evolution Strategies for Playing Atari''. [https://arxiv.org/abs/1802.08842 arXiv:1802.08842] <ref>[https://www.bbc.com/news/technology-43241936 AI finds novel way to beat classic Q*bert Atari video game], [https://en.wikipedia.org/wiki/BBC_News BBC News], March 01, 2018</ref> <ref>[https://en.wikipedia.org/wiki/Q*bert Q*bert from Wikipedia]</ref>
* [[Ilya Loshchilov]], [[#TGlasmachers|Tobias Glasmachers]], [https://de.wikipedia.org/wiki/Hans-Georg_Beyer_(Informatiker) Hans-Georg Beyer] ('''2019'''). ''Large Scale Black-Box Optimization by Limited-Memory Matrix Adaptation''. [[IEEE#EC|IEEE Transactions on Evolutionary Computation]], Vol. 23, No. 2

=External Links=
* [http://loshchilov.com/ Ilya Loshchilov]
* [https://github.com/loshchil loshchil (Ilya Loshchilov) · GitHub]
: [https://github.com/loshchil/SGDR GitHub - loshchil/SGDR]
: [https://github.com/loshchil/AdamW-and-SGDW GitHub - loshchil/AdamW-and-SGDW: Decoupled Weight Decay Regularization (ICLR 2019)]

=References=
<references />
'''[[People|Up one Level]]'''
[[Category:Researcher|Loshchilov]]
[[Category:Mathematician|Loshchilov]]
