Portkey provides a robust and secure gateway for integrating Large Language Models (LLMs), including the Cerebras Inference API, into applications. With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, while securely managing API keys through the Model Catalog.

Documentation Index
Fetch the complete documentation index at: https://portkey-docs-feat-support-overview-page.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
Quick Start
Get Cerebras working in 3 steps:

Tip: You can also set provider="@cerebras" in Portkey() and use just model="llama3.1-8b" in the request.

Add Provider in Model Catalog
- Go to Model Catalog → Add Provider
- Select Cerebras
- Choose existing credentials or create new by entering your Cerebras API key
- Name your provider (e.g., cerebras-prod)
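The steps above produce a provider slug that your code can reference instead of a raw Cerebras API key. A minimal sketch of the resulting request, assuming the portkey_ai Python SDK and the slug name chosen in step 4 (both placeholders):

```python
# Chat-completion parameters for a Cerebras model routed through Portkey.
# "@cerebras-prod" matches the provider name chosen in Model Catalog above;
# adjust it to whatever you entered there.
request = {
    "model": "llama3.1-8b",
    "messages": [{"role": "user", "content": "Why does inference speed matter?"}],
}

# With the SDK installed (pip install portkey-ai) and a Portkey API key,
# the request would be sent like this (assumed usage):
#   from portkey_ai import Portkey
#   client = Portkey(api_key="YOUR_PORTKEY_API_KEY", provider="@cerebras-prod")
#   response = client.chat.completions.create(**request)
#   print(response.choices[0].message.content)
print(request["model"])
```

Because the Cerebras API key lives in the Model Catalog credential, the application code never handles it directly.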
Complete Setup Guide →
See all setup options, code examples, and detailed instructions
Supported Models
Cerebras Models
View all available models and documentation
Next Steps
Add Metadata
Add metadata to your Cerebras requests
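Metadata travels with each request as a small string-to-string map and shows up in Portkey's logs for filtering. A sketch, assuming the portkey_ai SDK; the key names other than `_user` (which Portkey treats as the user identifier) are illustrative labels:

```python
# Example metadata map for tagging Cerebras requests in Portkey's logs.
# "_user" is Portkey's conventional user-identifier key; the remaining
# keys are arbitrary labels chosen for this example.
metadata = {
    "_user": "user-123",
    "environment": "production",
    "feature": "chat-summarizer",
}

# Attached per-request with the SDK (assumed usage):
#   client.with_options(metadata=metadata).chat.completions.create(...)
print(sorted(metadata))
```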
Gateway Configs
Add gateway configs to your Cerebras requests
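Gateway configs are JSON objects that tell the gateway how to handle a request (retries, routing, and so on). A minimal sketch following Portkey's config schema; the provider slug and retry count are placeholder choices:

```python
import json

# A gateway config that adds automatic retries to Cerebras requests.
# The "retry" block follows Portkey's config schema; the attempt count
# and the provider slug are illustrative values.
config = {
    "retry": {"attempts": 3},
    "provider": "@cerebras-prod",  # Model Catalog slug (assumed name)
}

# Passed at client creation (assumed usage):
#   client = Portkey(api_key="YOUR_PORTKEY_API_KEY", config=config)
print(json.dumps(config))
```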
Tracing
Trace your Cerebras requests
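Tracing groups related requests under a single trace ID so a multi-step interaction appears as one unit in Portkey's logs. Any unique string works; a UUID suffix is a common choice. A sketch, with the SDK usage shown as an assumption:

```python
import uuid

# One trace ID per user session groups its Cerebras requests together.
# The "cerebras-session-" prefix is an illustrative naming choice.
trace_id = f"cerebras-session-{uuid.uuid4()}"

# Sent per-request with the SDK (assumed usage):
#   client.with_options(trace_id=trace_id).chat.completions.create(...)
print(trace_id)
```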
Fallbacks
Setup fallback from OpenAI to Cerebras
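A fallback is expressed as a gateway config that lists targets in priority order: the gateway tries OpenAI first and routes to Cerebras if that call fails. A sketch following Portkey's strategy/targets config shape; the provider slugs and model names are placeholders for your own Model Catalog entries:

```python
import json

# Fallback config: try OpenAI first, fall back to Cerebras on failure.
# "strategy" and "targets" follow Portkey's gateway-config schema; the
# provider slugs and model names below are placeholder values.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"provider": "@openai-prod", "override_params": {"model": "gpt-4o"}},
        {"provider": "@cerebras-prod", "override_params": {"model": "llama3.1-8b"}},
    ],
}

# Attached at client creation (assumed usage):
#   client = Portkey(api_key="YOUR_PORTKEY_API_KEY", config=config)
print(len(config["targets"]))
```

Order matters: the first target is the primary route, and each later target is only tried after the previous one fails.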
SDK Reference
Complete Portkey SDK documentation

