UAE Has Launched GPT-4’s New Rival, Falcon 180B

Abu Dhabi’s Technology Innovation Institute (TII) has released Falcon 180B, a larger successor to its open-source Falcon 40B language model.


According to the official blog post, it is the largest openly available language model, with 180 billion parameters.

Earlier, in June, the institute released Falcon in three different sizes: 1B, 7B, and 40B.

Falcon 180B was trained on RefinedWeb, a dataset of 3.5 trillion tokens, using up to 4,096 GPUs. The model was then fine-tuned on a mix of conversational and instruction datasets for chat use.
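The model weights are published on Hugging Face, so the sketch below shows one way it might be loaded and queried with the transformers library. It assumes access has been granted to the gated "tiiuae/falcon-180B" repository and that enough GPU memory is available; the exact setup (precision, sharding, quantization) will vary with your hardware.

```python
# Minimal sketch: running Falcon 180B via Hugging Face transformers.
# Assumes access to the gated "tiiuae/falcon-180B" repo and very large
# GPU memory; quantization or a smaller Falcon variant may be needed
# on more modest hardware.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-180B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision weights to reduce memory
    device_map="auto",           # shard the model across available GPUs
)

prompt = "The Falcon has landed in"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```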


In benchmark comparisons, Falcon 180B is larger than models such as Llama 2 and GPT-3.5 and outperforms them across a range of evaluations, scoring roughly on par with Google’s PaLM 2-Large. It still falls short of GPT-4, however.


Commercial use of Falcon 180B is permitted, but under strict license terms that exclude offering the model as a hosted service. This makes it less commercially friendly than previous Falcon models.

Falcon 180B is now the largest open-source language model, surpassing Meta’s Llama 2. While Meta has backed open-source initiatives, Llama 2 comes with its own restrictions and a more complex licensing policy.

Meta itself is also reportedly working on closed models that are expected to be bigger and more capable.

OpenAI, however, is still considered the dominant player in the field. With Google’s Gemini model nearing release, the pressure is on OpenAI to stay ahead, likely with GPT-5.
