Sepp Hochreiter
Professor at Johannes Kepler University Linz, Founder of NXAI
Instead of the $1 billion that Amazon reaped by employing self-normalizing neural networks, Sepp Hochreiter, the mind behind the concept, walked away with a handshake and a Mojito. This isn't the first time a major U.S. tech company has profited handsomely from one of his innovations. Before the introduction of the Transformer architecture in 2017, the foundation of most large language models today, AI was largely driven by Long Short-Term Memory (LSTM) models, which Hochreiter developed with his supervisor Jürgen Schmidhuber at TU München in the 1990s. Their groundbreaking LSTM paper is widely regarded as one of the most cited AI papers of the 20th century and has earned Hochreiter the informal title of one of AI's "godfathers."
As a professor at JKU Linz in Austria, Hochreiter has set his sights on an ambitious new goal: replacing today's Transformer models with an updated version of the LSTM, dubbed xLSTM. Because these models process sequences recurrently rather than attending to every pair of tokens, they promise greater computational efficiency on long contexts as well as enhanced robustness. To drive adoption of the new models, Hochreiter has launched the startup NXAI. In his characteristic witty style, he declared in Zeit Online that "the empire strikes back," expressing hope that this time, European-made AI will bring more of its benefits to the old continent.