Groq
7 mentions across all digests
Groq is an AI inference chip startup whose specialized processors (LPUs) are optimized for fast token generation; Nvidia acquired it for approximately $20 billion, Nvidia's largest deal ever.
Open source memory layer so any AI agent can do what Claude.ai and ChatGPT do
STASH, an open-source memory layer, gives any AI agent persistent context retention across LLM providers, a capability previously exclusive to Claude.ai and ChatGPT.
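A minimal sketch of what a provider-agnostic memory layer does (illustrative only; STASH's actual API is not shown in the digest, and `MemoryStore`, `remember`, `recall`, and `build_context` are hypothetical names): remember facts per user, then retrieve relevant ones and inject them as context into any LLM prompt.

```python
# Hypothetical sketch of a provider-agnostic memory layer (not STASH's real API).
class MemoryStore:
    def __init__(self):
        self._facts = {}  # user_id -> list of remembered strings

    def remember(self, user_id, fact):
        self._facts.setdefault(user_id, []).append(fact)

    def recall(self, user_id, query, limit=3):
        # Naive keyword-overlap scoring; a real layer would use embeddings.
        terms = set(query.lower().split())
        scored = [
            (len(terms & set(f.lower().split())), f)
            for f in self._facts.get(user_id, [])
        ]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        return [f for score, f in scored[:limit] if score > 0]

    def build_context(self, user_id, query):
        # The injected preamble works with any provider's chat API,
        # since it is just text prepended to the prompt.
        memories = self.recall(user_id, query)
        if not memories:
            return ""
        return "Known about this user:\n" + "\n".join(f"- {m}" for m in memories)


store = MemoryStore()
store.remember("u1", "prefers Python over Go")
store.remember("u1", "works on AI gateways")
print(store.build_context("u1", "which language python or go"))
```

Because retrieval happens before the LLM call, the same store can back agents talking to OpenAI, Anthropic, or any other provider.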
Show HN: GoModel – an open-source AI gateway in Go; 44x lighter than LiteLLM
The Go-based AI gateway GoModel claims to be 44x lighter than LiteLLM while providing a unified OpenAI-compatible API across OpenAI, Anthropic, Gemini, and other providers.
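The core idea of such a gateway can be sketched as model-name-based routing: one OpenAI-shaped request comes in, and the gateway picks the backend from the model's prefix. This is an illustrative sketch, not GoModel's actual code (which is in Go); the endpoint URLs and the `route` function are assumptions for the example.

```python
# Hypothetical routing table mapping model-name prefixes to provider backends.
# Real endpoint paths and auth differ per provider.
PROVIDERS = {
    "gpt": "https://api.openai.com/v1/chat/completions",
    "claude": "https://api.anthropic.com/v1/messages",
    "gemini": "https://generativelanguage.googleapis.com/v1beta",
}

def route(model: str) -> str:
    """Pick a backend URL from the model name, e.g. 'claude-3-opus' -> Anthropic."""
    for prefix, url in PROVIDERS.items():
        if model.startswith(prefix):
            return url
    raise ValueError(f"no provider registered for model {model!r}")

print(route("gpt-4o"))        # routes to the OpenAI backend
print(route("claude-3-opus")) # routes to the Anthropic backend
```

The gateway's other job, translating the OpenAI request/response schema to each provider's native schema, sits behind this dispatch step.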
This startup is betting tokenmaxxing will create the next compute giant
Parasail raises $32M to aggregate distributed GPU capacity from 40 global data centers and undercut OpenAI/Anthropic's proprietary APIs by serving open-source model inference.
An Interview with Nvidia CEO Jensen Huang About Accelerated Computing
LWiAI Podcast #230 - 2025 Retrospective, Nvidia buys Groq, GLM 4.7, METR
Nvidia's $20B acquisition of Groq consolidates the inference chip market, while METR research surfaces rising costs and efficiency concerns for long-horizon AI agents.