April 27, 2023
Researchers have introduced LLaMA, a collection of foundation language models ranging from 7 billion to 65 billion parameters. These models are trained on trillions of tokens, demonstrating that it is possible to build state-of-the-art models using only publicly available datasets — a significant departure from the field's past reliance on proprietary, inaccessible data.
In a remarkable achievement, LLaMA-13B outperforms GPT-3 (175 billion parameters) on most benchmarks, and LLaMA-65B is competitive with the best models, such as Chinchilla-70B and PaLM-540B. The research team has released all of its models to the community, paving the way for further advances in artificial intelligence and natural language processing.
Take the first step toward harnessing the power of AI for your organization. Get in touch with our experts, and let's embark on a transformative journey together.
Contact us today.