Are large language models (LLMs) our new operating systems? If so, they are changing the definition of what we consider to be software.
Several analogies are used to describe the impact of fast-evolving AI technologies, such as utilities, time-sharing systems, and operating systems. Andrej Karpathy, co-founder of OpenAI and former senior director of AI at Tesla, believes that an operating system is the most apt analogy. In this scenario, LLMs are essentially new kinds of computers that orchestrate memory and compute for problem-solving activities.
LLMs are complex software ecosystems, Karpathy explained in a talk at a recent startup conference in San Francisco. There are many parallels between these ecosystems and the operating systems of yore. Tellingly, LLM environments “have a few closed-source providers like Windows or Mac OS, with an open-source alternative like Linux,” Karpathy related. “The Llama ecosystem is a close approximation to something like Linux.”
Utility and time-sharing computing are also apt analogies for LLMs because they are ubiquitous and capital-intensive to build. “We’re in this 1960s-ish era, where LLM compute is still very expensive for this new kind of computer,” Karpathy explained. “That forces the LLMs to be centralized in the cloud, and we’re just thin clients that interact with it over the network.”
This cost and complexity mean that, in a sense, a “personal computing revolution” hasn’t happened yet with LLMs, since “it’s just not economical and doesn’t make sense,” he added.
Unlike current operating systems, a common graphical user interface has not been developed for LLMs. “Whenever I talk to ChatGPT or some LLM directly in text, I feel like I’m talking to an operating system through a terminal,” said Karpathy. “A GUI hasn’t yet really been invented in a general way. Should ChatGPT have a GUI different than just text bubbles? Certainly, some apps have a GUI, but there’s no GUI across all the tasks.”
Despite these challenges, progress is rapid. Karpathy suggested that we are entering the era of “Software 3.0.” While Software 1.0 involved writing explicit code and Software 2.0 was built on neural nets, Software 3.0 programs are prompts written “in our native language of English.”
Karpathy suggested that his previous experience working on Tesla’s Autopilot technology illustrates the shift from Software 1.0 to 2.0, and on to 3.0. “They were going through a software stack to produce the steering and acceleration,” he explained.
“At one time, there was a ton of C++ code around in the autopilot, which was the Software 1.0 code, and there were some neural nets in there doing image recognition. As we made the autopilot better, the neural network grew in capability and size — while all the C++ code was being deleted. A lot of the capabilities and functionality that were originally written in 1.0 were migrated to 2.0.”
He said this approach fed images from multiple cameras through a neural network: “We were able to delete a lot of code. The Software 2.0 stack quite literally ate through the Software 1.0 stack of the autopilot.”
Karpathy said it’s possible to see this transition in development at a broader scale as we move to Software 3.0. “We have a new kind of software and it’s eating through the stack,” he said.
“We now have three completely different programming paradigms. If you’re entering the industry, it’s a very good idea to be fluent in all of them, because they all have pros and cons. You may want to program some functionality in 1.0 or 2.0 or 3.0. Are you going to train a neural net? Are you going to just prompt an LLM? Should this be a piece of code that’s explicit?”
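To make the trade-offs concrete, here is a minimal sketch in Python of one toy task, sentiment classification, expressed in each of the three paradigms. It is an illustration rather than anything from Karpathy’s talk: classify_v2 is a placeholder for a trained network, and the llm argument to classify_v3 stands in for whatever client you would use to call a hosted model.

```python
# A minimal sketch (not from Karpathy's talk) of one toy task,
# sentiment classification, expressed in each paradigm.

# --- Software 1.0: the program is explicit, hand-written code ---
POSITIVE_WORDS = {"great", "love", "excellent"}
NEGATIVE_WORDS = {"bad", "terrible", "hate"}

def classify_v1(text: str) -> str:
    """Hand-coded rules: the programmer spells out the logic directly."""
    words = set(text.lower().split())
    score = len(words & POSITIVE_WORDS) - len(words & NEGATIVE_WORDS)
    return "positive" if score >= 0 else "negative"

# --- Software 2.0: the program is a set of learned weights ---
def classify_v2(text: str) -> str:
    """Placeholder: a real 2.0 system would run `text` through a neural
    network whose trained weights, rather than hand-written rules,
    encode the decision logic."""
    raise NotImplementedError("requires a trained model")

# --- Software 3.0: the program is an English prompt ---
PROMPT = (
    "Classify the sentiment of the following review as exactly one word, "
    "either 'positive' or 'negative':\n\n{review}"
)

def classify_v3(text: str, llm) -> str:
    """The prompt carries the logic; `llm` is any callable that sends a
    string to a hosted model and returns its text reply (hypothetical)."""
    return llm(PROMPT.format(review=text)).strip().lower()

if __name__ == "__main__":
    print(classify_v1("I love this phone, the camera is excellent"))
```

The 1.0 version is transparent but brittle, the 2.0 version trades rules for training data, and the 3.0 version moves the logic into a natural-language prompt and an external model, which is the kind of trade-off Karpathy says developers now have to weigh.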
Software development and deployment are changing rapidly, with a shift from heads-down coding of commands to interactive dialogs with machines. In the process, this new era is opening a wide range of application possibilities.