Instead, he imagines the eventual outcome to be “a multiplicity of models that are more customized, specialized, and that are going to solve different problems.”
It’s of course important to note that his company is focused on being a GitHub-like repository for exactly those sorts of specialized models: big models released by companies like OpenAI and Meta (gpt-oss and Llama 3.2, for example), fine-tuned variants that developers have adapted to specific needs, and smaller models developed by researchers. That’s essentially what Hugging Face is about.
So yes, it’s natural that Delangue would say that. But he’s not alone. Research firm Gartner, for one, predicted in April that “the variety of tasks in business workflows and the need for greater accuracy are driving the shift towards specialized models fine-tuned on specific functions or domain data.”
Regardless of which way LLM-based applications go, investment in other applications of AI is only just getting started. Earlier this week, it was revealed that former Amazon CEO Jeff Bezos will be co-CEO of a new AI startup focused on applications of machine learning in engineering and manufacturing—and that startup has launched with over $6 billion in funding.
That, too, could be a bubble. But even though some of Delangue’s statements in the AI bubble discourse are clearly meant to prop up Hugging Face, there’s a helpful reminder in them: the overbroad term “AI” covers far more than just large language models, and we’re still in the early days of seeing where these methodologies will lead us.