
Meta's Stealthy Launch of Llama 2 Long: The AI That Outshines GPT-3.5 and Claude 2

February 27, 2024

Meta Platforms, known for its social media services Facebook, Instagram, and WhatsApp, quietly made a significant announcement: a research paper on arXiv.org detailing a new AI model called Llama 2 Long. The model extends the company's open-source Llama 2 to handle longer text sequences, and it notably outperforms some of the leading AI models, including OpenAI's GPT-3.5 Turbo and Claude 2, at generating responses to long user prompts.

The Genesis of Llama 2 Long

Meta researchers took the original Llama 2 and modified it to handle longer text sequences. They continued training on an additional 400 billion tokens and made the necessary modifications to the positional encoding. The new model comes in various sizes, ranging from 7 billion to 70 billion parameters.

The Technical Nitty-Gritty

The key change was to the Rotary Positional Embedding (RoPE) encoding, which tells the model where each token sits in a sequence and is crucial for attending over longer inputs. By decreasing the rotation angle of the encoding, the researchers allowed the model to attend to more "distant tokens," meaning tokens that sit far apart in a long input, rather than letting their influence fade with distance.
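To make the rotation-angle idea concrete, here is a minimal sketch of how RoPE's per-position angles depend on the base frequency. Raising the base shrinks every angle, so far-apart positions rotate less relative to each other. The function name and the alternative base value of 500,000 are illustrative assumptions, not Meta's exact code.

```python
import numpy as np

def rope_angles(pos, dim=64, base=10000.0):
    # RoPE rotates each pair of embedding dimensions by pos * inv_freq,
    # where inv_freq = base^(-2i/dim) for pair index i.
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)
    return pos * inv_freq

# Llama 2 uses base=10000. A larger base (500000 here, an assumed value
# for illustration) yields smaller rotation angles at the same position,
# so distant tokens are rotated less and remain easier to attend to.
angles_default = rope_angles(4096, base=10000.0)
angles_long = rope_angles(4096, base=500000.0)
assert np.all(angles_long <= angles_default)
```

The sketch only computes the angles themselves; in a real model these angles parameterize 2x2 rotations applied to query and key vectors before attention.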

Performance Metrics

Using Reinforcement Learning from Human Feedback (RLHF), the researchers improved the model's performance on tasks such as coding, math, language understanding, and common-sense reasoning. The results make Llama 2 Long a formidable competitor in the AI landscape.

Open Source vs. Closed Source

The release of Llama 2 Long has been well-received by the open-source AI community. It serves as a validation of Meta's open-source approach and shows that it can compete with closed-source models offered by well-funded startups.

FAQ

What is Llama 2 Long?

Llama 2 Long is a new AI model released by Meta Platforms that is designed to handle longer text sequences and outperforms some leading AI models.

How is it different from the original Llama 2?

It includes an additional 400 billion tokens in its training dataset and has undergone modifications to handle longer text sequences.

What does this mean for the AI community?

The release of Llama 2 Long has been well-received and serves as a validation of open-source approaches in AI development.
