Parameters are the internal values an LLM learns during training and uses to generate predictions, and the parameter count offers a very rough gauge of how sophisticated a model is. For comparison, GPT-3, the model that first powered OpenAI’s ChatGPT, had 175 billion parameters. The company has not revealed how many parameters GPT-4 has, but Semafor reported last month that the latest version of OpenAI’s LLM has 1 trillion parameters. Still, the number of parameters doesn’t necessarily determine the quality of the results the AI generates, and more parameters usually mean the model needs far more computing power to actually generate content.
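For a concrete sense of what a parameter count actually measures, here is a minimal sketch in Python using PyTorch (an assumed framework for illustration; the article doesn't say how any of these models are built) that tallies the learnable weights of a toy network the same way parameter counts are reported for LLMs:

```python
import torch.nn as nn

# A toy two-layer network; real LLMs stack many transformer
# layers, but the counting works the same way.
model = nn.Sequential(
    nn.Linear(512, 2048),  # weights: 512*2048, biases: 2048
    nn.ReLU(),
    nn.Linear(2048, 512),  # weights: 2048*512, biases: 512
)

# Every learnable weight and bias is a "parameter."
total = sum(p.numel() for p in model.parameters())
print(f"{total:,} parameters")  # 2,099,712
```

Scale that tally up by roughly six orders of magnitude and you get GPT-3's 175 billion, which is why bigger counts translate directly into bigger compute bills.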

Stability AI is aware that it needs to punch up to compete with its bigger, Microsoft-backed rivals. The company said the tool was developed to help “everyday people and everyday firms use AI to unlock creativity,” and advertised that it is “focused on efficient, specialized, and practical AI performance—not a quest for god-like intelligence.” That last bit reads like a pointed dig at OpenAI, whose execs seem fixated on the idea of super-intelligent AI.

On Twitter, Mostaque said both the LLM and its training data will only get better with time, adding that he wants the model to eventually be trained on 3 trillion tokens. Tokens are the units of text a model actually processes; a token can be a whole word, a fragment of a word, or even a single character.
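To see what a token looks like in practice, here is a minimal sketch using OpenAI's open-source tiktoken tokenizer (chosen purely as a familiar example; Stability AI hasn't said which tokenizer its model uses):

```python
import tiktoken  # pip install tiktoken

# cl100k_base is the encoding used by some recent OpenAI models;
# picked here only for illustration.
enc = tiktoken.get_encoding("cl100k_base")

text = "Stability AI trains large language models."
token_ids = enc.encode(text)

# One token is often a whole word, sometimes just a fragment.
print(len(token_ids), "tokens")
print([enc.decode([t]) for t in token_ids])
```

Counted this way, 3 trillion tokens amounts to a training corpus on the order of trillions of words.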

Stability AI has long been evangelical in the way it talks about AI, with Mostaque routinely beating the drum for proliferating, open-source AI programs, come hell or high water. But the company has reportedly struggled with money lately as it spends heavily on its AI projects while richer companies soak up the attention. The startup recently showed off its enterprise-focused Stable Diffusion XL model, which is meant to be even better than the company’s previous AI image generators. The company says it still plans to open-source that newer generative AI model… eventually.