Tomáš Mikolov
Head of Foundational AI, Czech Institute of Informatics, Robotics and Cybernetics
Tomáš Mikolov is recognized as a pioneer in the field of natural language processing. After completing his PhD at Brno University of Technology, Czech Republic, in 2012, he joined Google as a research scientist, where he published his seminal paper on training word representations (now commonly referred to as embeddings) with neural networks. Mikolov’s approach became known as "Word2Vec", a breakthrough that has left a lasting mark on the field. He continued his research career at Facebook until 2020, then joined the Czech Institute of Informatics, Robotics and Cybernetics in Prague.
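The core idea behind Word2Vec is that words appearing in similar contexts should receive similar vectors. A minimal sketch of that idea, using a toy skip-gram model with negative sampling in pure Python (the corpus, dimensions, and hyperparameters here are illustrative assumptions, not Mikolov's original implementation):

```python
import math
import random

# Toy corpus: "cat" and "dog" appear in similar contexts (before "sat on").
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

dim, lr, window, epochs = 8, 0.05, 2, 200  # illustrative hyperparameters
random.seed(0)
W_in = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in vocab]
W_out = [[0.0] * dim for _ in vocab]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_pair(center, context, label):
    # One SGD step on a (center, context) pair; label=1 for observed
    # pairs, label=0 for randomly drawn negative samples.
    v, u = W_in[center], W_out[context]
    score = sigmoid(sum(a * b for a, b in zip(v, u)))
    g = lr * (label - score)
    for d in range(dim):
        v[d], u[d] = v[d] + g * u[d], u[d] + g * v[d]

for _ in range(epochs):
    for i, w in enumerate(corpus):
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if i == j:
                continue
            train_pair(idx[w], idx[corpus[j]], 1)                 # positive pair
            train_pair(idx[w], random.randrange(len(vocab)), 0)   # negative sample

def cosine(a, b):
    num = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return num / (na * nb + 1e-9)

# Words that share contexts tend to drift toward each other in vector space.
sim = cosine(W_in[idx["cat"]], W_in[idx["dog"]])
```

On real corpora with millions of tokens, the same mechanism produces the well-known regularities (e.g. analogies recoverable by vector arithmetic) that made the method famous.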
“We should abandon the emphasis on state-of-the-art results and complex models and focus on discovering interesting new ideas.”
In the ongoing debate about whether more computing power or new methods will shape the future of AI, Mikolov leans toward the latter. “[H]aving more computing power never hurts,” he notes in an interview with the Benelux Association for Artificial Intelligence (BAAI), reflecting on the history of recurrent neural architectures in the 2010s. However, he emphasizes that “the deciding factor for the[ir] popularity in the research community was how to use these algorithms correctly.” Open-sourcing code and datasets, he adds, was another key factor.
In a conversation with THE RECURSIVE, Mikolov expresses his ambition to build a research group dedicated to fundamental AI problems, as well as to establish a local AI company in the Czech Republic. He is currently advocating for more funding to support these initiatives.
Regarding his research philosophy, he proposes in his conversation with BAAI: “We should abandon the emphasis on state-of-the-art results and complex models and focus on discovering interesting new ideas.”