DBRX

From Wikipedia, the free encyclopedia

DBRX
Developer(s): Mosaic ML and Databricks team
Initial release: March 27, 2024
Repository: https://github.com/databricks/dbrx
License: Databricks Open License
Website: https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm

DBRX is an open-sourced large language model (LLM) developed by the Mosaic ML team at Databricks and released on March 27, 2024.[1][2][3] It is a mixture-of-experts Transformer model with 132 billion parameters in total, of which 36 billion (4 out of 16 experts) are active for each token.[4] The model was released in both a base foundation version and an instruction-tuned variant.[5]
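
The mixture-of-experts layout described above can be illustrated with a short routing sketch. The following snippet is a minimal, hypothetical example of top-4-of-16 expert routing written in PyTorch; the layer sizes, router, and module structure are illustrative assumptions and do not reflect the actual DBRX implementation.

    # Minimal sketch of mixture-of-experts routing (16 experts, 4 active per token).
    # All dimensions and module choices here are illustrative assumptions,
    # not the actual DBRX architecture.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MoELayer(nn.Module):
        def __init__(self, d_model=512, d_ff=2048, n_experts=16, top_k=4):
            super().__init__()
            self.top_k = top_k
            self.router = nn.Linear(d_model, n_experts)   # scores each token against every expert
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
                for _ in range(n_experts)
            )

        def forward(self, x):                             # x: (tokens, d_model)
            scores = self.router(x)                       # (tokens, n_experts)
            weights, idx = scores.topk(self.top_k, dim=-1)
            weights = F.softmax(weights, dim=-1)          # renormalise over the chosen experts
            out = torch.zeros_like(x)
            for k in range(self.top_k):                   # only the selected experts run per token
                for e in range(len(self.experts)):
                    mask = idx[:, k] == e
                    if mask.any():
                        out[mask] += weights[mask, k:k+1] * self.experts[e](x[mask])
            return out

    # Example: route 8 tokens through the layer
    tokens = torch.randn(8, 512)
    print(MoELayer()(tokens).shape)   # torch.Size([8, 512])

Because only 4 of the 16 experts are evaluated per token, a layer of this kind uses only a fraction of its total parameters on any given forward pass, which is how DBRX can hold 132 billion parameters while activating roughly 36 billion per token.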

DBRX outperforms other prominent open-source models such as Meta's LLaMA 2, Mistral AI's Mixtral, and xAI's Grok, as well as closed-source models such as GPT-3.5, on several benchmarks spanning language understanding, programming ability, and mathematics.[4][6][7][unreliable source?] As of March 28, 2024, this made DBRX the world's most powerful open-source model.[8]

It was trained over 2.5 months[8] on 3,072 Nvidia H100 GPUs connected by 3.2 terabytes per second of InfiniBand bandwidth, at a training cost of US$10 million.[1]
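
As a rough plausibility check (not taken from the cited sources), the reported figures can be compared against a back-of-envelope estimate that assumes 30-day months and a nominal rate of about $2 per H100 GPU-hour; both assumptions are illustrative only.

    # Back-of-envelope check of the reported training scale.
    # The 30-day month and ~$2/GPU-hour rate are assumptions, not sourced figures.
    gpus = 3072
    months = 2.5
    gpu_hours = gpus * months * 30 * 24        # about 5.5 million GPU-hours
    est_cost = gpu_hours * 2.0                 # about $11 million, same order as the reported $10M
    print(f"{gpu_hours:,.0f} GPU-hours, ~${est_cost / 1e6:.0f}M")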

References

  1. ^ a b "Introducing DBRX: A New State-of-the-Art Open LLM". Databricks. 2024-03-27. Retrieved 2024-03-28.
  2. ^ "New Databricks open source LLM targets custom development | TechTarget". Business Analytics. Retrieved 2024-03-28.
  3. ^ Ghoshal, Anirban (2024-03-27). "Databricks' open-source DBRX LLM beats Llama 2, Mixtral, and Grok". InfoWorld. Retrieved 2024-03-28.
  4. ^ a b "A New Open Source LLM, DBRX Claims to be the Most Powerful – Here are the Scores". GIZMOCHINA. Mar 28, 2024.
  5. ^ Wiggers, Kyle (2024-03-27). "Databricks spent $10M on new DBRX generative AI model". TechCrunch. Retrieved 2024-03-29.
  6. ^ "Databricks releases DBRX: open-source LLM that beats GPT-3.5 and Llama 2". Techzine Europe. 2024-03-27. Retrieved 2024-03-28.
  7. ^ "Data and AI company DataBrix has launched a general-purpose large language model (LLM) DBRX that out." Maeil Business Newspaper. 2024-03-28. Retrieved 2024-03-28.
  8. ^ a b Knight, Will. "Inside the Creation of the World's Most Powerful Open Source AI Model". Wired. ISSN 1059-1028. Retrieved 2024-03-28.