Meta and Groq Continue To Build Open-source Developer Ecosystem as Llama 3.2 Launches

Written by:
Groq

Since the first Llama launch in February 2023, Groq has set the standard for fast AI inference on open-source models by delivering leading throughput with LPU™ AI inference technology. We continue on that path today and are proud to partner with Meta as they advance the models and modalities available to the open-source ecosystem. That includes today’s Meta announcement of Llama 3.2, which includes small and medium-sized vision Large Language Models (LLMs) at 11B and 90B parameters, as well as lightweight text-only models at 1B and 3B.

Ahmad Al-Dahle, Head of GenAI at Meta, shared, “At Meta, we’re committed to providing developers and enterprises the most advanced openly-available AI models that unlock the next level of innovation. Our partnership with Groq is pivotal to this mission, as their cutting-edge, fast inference technology enables the seamless deployment of these models, giving developers the speed, agility, and scalability they need to build the next generation of AI-powered applications.”

This is the second time Groq and Meta have partnered on a key industry LLM launch, continuing to deliver on their shared commitment to building a cutting-edge, cost-effective, modifiable, open AI ecosystem. This partnership ensures developers stay at the front lines of innovation and enables people to achieve creative, useful, and life-changing breakthroughs with generative AI.

Groq is excited to share preview versions of a number of these models, with additional models from the 3.2 suite coming soon to the GroqCloud™ Dev Console. Llama 3.2 joins the other models for which Groq delivers industry-leading inference performance across multiple modalities, including text, audio (Whisper), and image (LLaVA) models. Llama 3.2 expands those image capabilities, which are fully integrated into the 11B and 90B vision models.

The two largest models of the Llama 3.2 suite, 11B and 90B, support image reasoning use cases while the lightweight 1B and 3B models are text only. Today, developers can access llama-3.2-1b-preview, llama-3.2-3b-preview, llama-3.2-11b-text-preview, and llama-3.2-90b-text-preview on Groq.
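As a rough sketch of what calling one of these preview models looks like, the snippet below uses Groq's Python SDK (the `groq` package). The prompt and client setup are illustrative assumptions; the model ID comes from the list above.

```python
# Minimal sketch: calling a Llama 3.2 preview model on GroqCloud.
# Assumes the `groq` Python SDK is installed (`pip install groq`) and that
# GROQ_API_KEY is set in the environment; adjust if your setup differs.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

completion = client.chat.completions.create(
    model="llama-3.2-11b-text-preview",  # one of the preview IDs listed above
    messages=[
        {"role": "user", "content": "Summarize what Llama 3.2 adds in one sentence."},
    ],
)

print(completion.choices[0].message.content)
```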

With this release, Meta also shared the first official distribution of Llama Stack. Llama Stack shows that the developer is top of mind as Meta builds out the open-source ecosystem, and it enables providers like Groq to offer further resources and stay aligned with the developer community.

Sunny Madra, GM GroqCloud, shared, “With almost half a million developers who have adopted GroqCloud this year, we know developers are hungry for an open-source ecosystem. We’re excited that an industry leader as large and trusted as Meta offers partnership opportunities to startups like Groq in an effort to build the foundation of that ecosystem for both developers and the enterprise.”

This year, Llama has achieved 10x growth and remains one of the leading openly-available LLMs in the industry. Combined with Groq’s fast inference, the speed of generation is now truly fueling the speed of innovation.

For developers, getting started with Groq is an easy three-step process: replace your existing industry-standard API key with a free Groq API key, set the base URL, and run. Head over to the GroqCloud Dev Console and start building today!
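For those already using an OpenAI-compatible client, the three steps above roughly translate to the sketch below. The base URL shown is Groq's OpenAI-compatible endpoint as we understand it, and the key handling and model choice are assumptions to adapt to your own setup.

```python
# Sketch of the three-step switch: swap in a Groq API key, point the base URL
# at GroqCloud's OpenAI-compatible endpoint, and run your existing code.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],          # step 1: free Groq API key
    base_url="https://api.groq.com/openai/v1",   # step 2: set the base URL
)

# Step 3: run, using one of the Llama 3.2 preview models named above.
response = client.chat.completions.create(
    model="llama-3.2-3b-preview",
    messages=[{"role": "user", "content": "Hello from GroqCloud!"}],
)
print(response.choices[0].message.content)
```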
