Fedor Fomin - Publications


Co-author of three influential books:

Fomin, Fedor V.; Kratsch, Dieter (2010). Exact Exponential Algorithms. Springer. ISBN 978-3-642-16532-0.

Cygan, Marek; Fomin, Fedor V.; Kowalik, Lukasz; Lokshtanov, Daniel; Marx, Daniel; Pilipczuk, Marcin; Pilipczuk, Michal; Saurabh, Saket (2015). Parameterized Algorithms. Springer. ISBN 978-3-319-21274-6.

Fomin, Fedor V.; Lokshtanov, Daniel; Saurabh, Saket; Zehavi, Meirav (2019). Kernelization: Theory of Parameterized Preprocessing. Cambridge University Press. ISBN 9781107415157.

Prof. Fomin is responsible for several breakthroughs in the theory of algorithms, including:

Bidimensionality Theory - a unifying framework for the subexponential parameterized, approximation, and kernelization algorithms designed by many researchers for various problems on planar graphs. Bidimensionality theory builds on deep theorems from another domain, the Graph Minors theory of Robertson and Seymour, one of the deepest and most significant achievements of modern combinatorics. Prior to this work, graph minors were considered of purely theoretical interest; Prof. Fomin and his coauthors were the first to turn profound mathematical results from Graph Minors into powerful new algorithmic tools. The EATCS-IPEC Nerode Prize 2015 for outstanding papers in the area of multivariate algorithmics was awarded to Prof. Fomin and his coauthors (Erik D. Demaine (MIT), MohammadTaghi Hajiaghayi (Univ. of Maryland), and Dimitrios M. Thilikos (CNRS)) for their joint paper establishing the foundations of bidimensionality theory [Subexponential Parameterized Algorithms on Bounded-Genus Graphs and H-Minor-Free Graphs, Journal of the ACM, vol. 52, no. 6, November 2005].

Quoting the laudation by David Eppstein, Georg Gottlob, and Jan Arne Telle:

"…The paper opened a new area of research, and gave rise to a significant number of follow-up papers. The work of these four nominees in developing bidimensionality theory can be seen as a 'roots' contribution - since parameterized complexity was originally stimulated by graph minor theory and its algorithmic consequences."

Measure & Conquer - a novel methodology that drastically improves and simplifies the analysis of branching algorithms. Many classical algorithms have been improved by making use of Measure & Conquer, and the technique is the de facto standard tool in the modern analysis of branching algorithms.

The paper by Prof. Fomin and his coauthors (Fabrizio Grandoni (IDSIA, University of Lugano, Switzerland) and Dieter Kratsch (Université de Lorraine - Metz, France)) [A Measure & Conquer Approach for the Analysis of Exact Algorithms, Journal of the ACM 56 (5): Article 25, 2009] became the classical reference in the field. In 2017 it received the EATCS-IPEC Nerode Prize for outstanding papers in the area of multivariate algorithmics. As David Eppstein, Daniel Marx, and Jianer Chen wrote in the laudation: "…Measure and conquer unified and generalized earlier attempts to formalize the analysis of backtracking algorithms. It improved our understanding of the branch-and-reduce backtracking process, provided an easy to use framework, and opened new perspectives for development of practically feasible algorithms for solving hard computational problems."
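To give a flavour of the kind of algorithm this technique analyses, here is a minimal sketch: a textbook branch-and-reduce algorithm for Maximum Independent Set, together with a Measure & Conquer style measure that weights vertices by degree instead of counting them uniformly (the weights below are illustrative placeholders, not the values from the paper, and the algorithm is not the one analysed there).

```python
def remove(adj, s):
    """Delete the vertex set s from the graph (adjacency dict)."""
    return {u: nb - s for u, nb in adj.items() if u not in s}

def mis(adj):
    """Branch-and-reduce for Maximum Independent Set.
    adj: dict mapping each vertex to its set of neighbours."""
    if not adj:
        return 0
    # Reduction rule: a vertex of degree <= 1 belongs to some optimum solution.
    for v, nb in adj.items():
        if len(nb) <= 1:
            return 1 + mis(remove(adj, {v} | nb))
    # Branch on a maximum-degree vertex v:
    v = max(adj, key=lambda u: len(adj[u]))
    # either v is not in the solution ...
    best = mis(remove(adj, {v}))
    # ... or v is, and then none of its neighbours are.
    return max(best, 1 + mis(remove(adj, {v} | adj[v])))

def measure(adj, w=(0.0, 0.5, 0.8, 1.0)):
    """Measure & Conquer idea: bound the running time in a weighted measure
    of the instance rather than the vertex count. Weights are illustrative."""
    return sum(w[min(len(nb), 3)] for nb in adj.values())
```

The point of the measure is that both branches shrink it by more than one unit (removing a high-degree vertex also lowers its neighbours' degrees and hence their weights), which yields sharper branching recurrences than counting vertices.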

Preprocessing (kernelization) algorithms. Prof. Fomin is also responsible for a number of very general results (meta-theorems) establishing generic conditions for the existence of polynomial kernelization algorithms for various problems. This work exposes deep relations between logic, combinatorial structures, and preprocessing algorithms. A representative paper in this area is [Hans L. Bodlaender, Fedor V. Fomin, Daniel Lokshtanov, E. Penninkx, Saket Saurabh, and Dimitrios M. Thilikos, (Meta) Kernelization, Journal of the ACM 63 (5), Article No. 44, 2016].
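As a simple concrete example of what a kernelization algorithm does (this is Buss's classical kernel for Vertex Cover, shown for illustration; it is not a construction from the (Meta) Kernelization paper), the sketch below shrinks an instance (edges, k) to an equivalent instance with at most k² edges, or proves that no vertex cover of size k exists:

```python
def buss_kernel(edges, k):
    """Buss's kernel for Vertex Cover.
    Returns an equivalent reduced instance (edges, k) with at most k*k edges,
    or None if the instance provably has no vertex cover of size k."""
    edges = {frozenset(e) for e in edges}
    while True:
        if k < 0:
            return None  # budget exhausted while edges may remain
        # Compute vertex degrees in the current instance.
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        # Rule: a vertex of degree > k must be in every vertex cover of size k,
        # so take it into the cover and delete its incident edges.
        high = next((v for v, d in deg.items() if d > k), None)
        if high is None:
            break
        edges = {e for e in edges if high not in e}
        k -= 1
    # With maximum degree <= k, a cover of k vertices covers at most k*k edges.
    if len(edges) > k * k:
        return None
    return edges, k
```

The guarantee typical of kernelization is visible here: the reduction runs in polynomial time, and the size of the surviving instance is bounded by a function of the parameter k alone, independent of the input size.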

This page (revision-4) was last changed on Wednesday, 3 July 2019, 13:02 by System