Using Groq, one of the fastest inference systems, with Elephas

Groq AI offers some of the fastest inference available, meaning responses are generated at very high speed. It hosts several popular AI models, including the following (a quick way to try them from the API is sketched after the list):

  • Llama 3 70B
  • Llama 3 8B
  • Mixtral 8x7B
  • Gemma 7B
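
To get a feel for that speed once you have an API key (creating one is covered below), you can time a single request against Groq's API. The snippet below is a minimal sketch, not Elephas code: it assumes Groq's OpenAI-compatible chat completions endpoint, the model ID "llama3-70b-8192", and that you have exported your key as a GROQ_API_KEY environment variable; model names can change, so check the Groq console for the current ones.

```python
import os
import time
import requests

# Minimal sketch: send one prompt to Groq's OpenAI-compatible chat
# completions endpoint and time the round trip.
# Assumptions: the endpoint URL and the model ID "llama3-70b-8192"
# are taken from Groq's public docs and may change.
GROQ_API_KEY = os.environ["GROQ_API_KEY"]  # key created in the section below

start = time.perf_counter()
resp = requests.post(
    "https://api.groq.com/openai/v1/chat/completions",
    headers={"Authorization": f"Bearer {GROQ_API_KEY}"},
    json={
        "model": "llama3-70b-8192",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=30,
)
resp.raise_for_status()
elapsed = time.perf_counter() - start

data = resp.json()
print(data["choices"][0]["message"]["content"])
print(f"Round trip: {elapsed:.2f}s, {data['usage']['total_tokens']} tokens")
```

The elapsed time printed at the end gives a rough sense of the responsiveness you can expect inside Elephas.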
 

Llama 3 70B, one of the most powerful open-source models, is priced at $0.59 per million input tokens (check Groq's pricing page for current rates). Here are the details from Meta's launch page: https://ai.meta.com/blog/meta-llama-3/

💡
In some tests, Llama 3 produced better responses for English-language content than the GPT-3.5 Turbo model.
 

Creating an API key with Groq

 

Visit https://console.groq.com/login and create a new account.

 

Then visit https://console.groq.com/keys to create a new API key.

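Before pasting the key into Elephas, you can optionally confirm it works. The sketch below lists the models the key can access; it assumes Groq exposes an OpenAI-compatible models endpoint and that you have exported the key as a GROQ_API_KEY environment variable, neither of which is required for Elephas itself.

```python
import os
import requests

# Minimal sketch: confirm a freshly created Groq API key works by
# listing the models it can access.
# Assumption: Groq exposes an OpenAI-compatible "models" endpoint.
GROQ_API_KEY = os.environ["GROQ_API_KEY"]

resp = requests.get(
    "https://api.groq.com/openai/v1/models",
    headers={"Authorization": f"Bearer {GROQ_API_KEY}"},
    timeout=15,
)
resp.raise_for_status()  # a 401 here usually means the key was copied incorrectly

for model in resp.json()["data"]:
    print(model["id"])
```

If the request succeeds, you should see IDs for models such as the Llama 3 and Mixtral variants listed above.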

Then add that API key under Elephas → Preferences → AI Providers: select “Groq” and enter the key you just created.

 

You can then use the models provided by Groq AI in the Chat Settings and Cost & Accuracy sections under Preferences → General.

 