Jaroslav Beck is launching a new startup together with renowned scientist Tomáš Mikolov that aims to achieve a breakthrough in the development of the large language models powering today’s artificial intelligence. The founding team also includes David Herel as lead developer, and their mission is highly ambitious: Beck is investing 10 million USD (roughly 240 million CZK).
Beck’s goal is to create a prototype demonstrating how to dramatically increase the efficiency of language models. Specifically, the trio behind BottleCap AI talks about making training up to 100 times more efficient.
“We want to achieve a really dramatic jump in efficiency, but we will get there gradually. As with any new technology, we will improve step by step over a short time until our research is validated on large models,” Beck explains.
This also means the team has no ambition to compete with the technology giants in computing power; instead, it is focusing on innovating the training process itself. In the first phase, the company will not build a new language model to rival ChatGPT or DeepSeek, but will instead look for a way to significantly improve all existing ones.
Beck does not rule out future cooperation with companies that develop language models, and he admits that China’s DeepSeek has shown how much room there is in this area. “This shocked the entire community, which until then had not seriously addressed efficiency. In our opinion, it can be done much better, and above all differently.”
The company’s AI research will be led by Tomáš Mikolov, a scientist in the AI field with experience at Microsoft, Google, and Facebook.
The founding trio is completed by AI researcher David Herel, who serves as head of product at BottleCap AI and has collaborated with Mikolov in the past.