Will OpenAI use Groq chips for their LLMs in 2024?
Dec 31 · 10% chance

According to Groq's website, they are already working with Poe.

Groq is a featured inference provider for poe.com, hosting Llama 2 70B and Mixtral 8x7b running on the LPU™ Inference Engine.
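
For context on what "hosting on the LPU Inference Engine" looks like from a developer's side, here is a minimal sketch of calling a Groq-hosted model through an OpenAI-compatible client. The base URL `https://api.groq.com/openai/v1`, the environment variable name, and the model id `mixtral-8x7b-32768` are assumptions for illustration, not details taken from this thread.

```python
# Minimal sketch: querying a Groq-hosted model via an OpenAI-compatible client.
# Assumptions (not from this thread): Groq exposes an OpenAI-compatible endpoint
# at https://api.groq.com/openai/v1 and serves Mixtral 8x7B as "mixtral-8x7b-32768".
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],          # assumed env var name
    base_url="https://api.groq.com/openai/v1",   # assumed Groq endpoint
)

response = client.chat.completions.create(
    model="mixtral-8x7b-32768",                  # assumed model id
    messages=[
        {"role": "user", "content": "Summarize the LPU Inference Engine in one sentence."}
    ],
)
print(response.choices[0].message.content)
```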

@Soli Where did this version come from?

@singer DALL-E
