# Groq

> Fast LLM inference, OpenAI-compatible. Simple to integrate, easy to scale. Start building in minutes.

## Pages

- [API Reference - GroqDocs](api-reference.md)
- [Batch Processing - GroqDocs](batch.md)
- [Embeddings - GroqDocs](embeddings.md)
- [Prompting - GroqDocs](prompting.md)
- [Speech to Text - GroqDocs](speech-text.md)
- [Text Generation - GroqDocs](text-chat.md): Fast language models have gained significant attention in recent years due to their ability to process and generate h...
- [Text to Speech - GroqDocs](text-to-speech.md)
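
Because the API is OpenAI-compatible, a minimal sketch like the one below should be enough to start; it assumes the `openai` Python package, a `GROQ_API_KEY` environment variable, the `https://api.groq.com/openai/v1` base URL, and the `llama-3.3-70b-versatile` model id (check the API reference page above for current endpoints and model names).

```python
# Minimal sketch: calling Groq through an OpenAI-compatible Chat Completions client.
# Assumptions: the `openai` package is installed, GROQ_API_KEY is set, and the
# base URL / model id below are current (see api-reference.md for the live list).
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # assumed OpenAI-compatible endpoint
    api_key=os.environ["GROQ_API_KEY"],
)

response = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # assumed model id
    messages=[{"role": "user", "content": "Summarize what Groq offers in one sentence."}],
)

print(response.choices[0].message.content)
```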