ZhipuAI has developed the GLM series of open-source LLMs, which are among the best-performing and most capable models available today. Portkey provides a robust and secure gateway to seamlessly integrate these LLMs into your applications using the familiar OpenAI spec with just a 2-line code change!
With Portkey, you can leverage powerful features like fast AI gateway, caching, observability, prompt management, and more, while securely managing your LLM API keys through a virtual key system.
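These gateway features are driven by Portkey configs attached to your requests. As an illustration only, a minimal config sketch that enables simple response caching and retries might look like the following (field names follow the shape described in Portkey's config documentation; check that reference for the authoritative schema):

```json
{
  "cache": { "mode": "simple", "max_age": 3600 },
  "retry": { "attempts": 3 }
}
```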
Portkey SDK Integration with ZhipuAI
1. Install the Portkey SDK
Install the Portkey SDK in your project using npm or pip:
npm install --save portkey-ai
pip install portkey-ai
2. Initialize Portkey with the Virtual Key
To use ZhipuAI / ChatGLM / BigModel with Portkey, get your API key from here, then add it to Portkey to create the virtual key.
NodeJS SDK
Python SDK
OpenAI Node SDK
OpenAI Python SDK
import Portkey from 'portkey-ai'
const portkey = new Portkey({
apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
virtualKey: "VIRTUAL_KEY" // Your ZhipuAI Virtual Key
})
from portkey_ai import Portkey
portkey = Portkey(
api_key="PORTKEY_API_KEY", # Replace with your Portkey API key
virtual_key="VIRTUAL_KEY" # Replace with your virtual key for ZhipuAI
)
import OpenAI from "openai";
import { PORTKEY_GATEWAY_URL, createHeaders } from "portkey-ai";
const portkey = new OpenAI({
baseURL: PORTKEY_GATEWAY_URL,
defaultHeaders: createHeaders({
apiKey: "PORTKEY_API_KEY",
virtualKey: "ZHIPUAI_VIRTUAL_KEY",
}),
});
from openai import OpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders
portkey = OpenAI(
base_url=PORTKEY_GATEWAY_URL,
default_headers=createHeaders(
api_key="PORTKEY_API_KEY",
virtual_key="ZHIPUAI_VIRTUAL_KEY"
)
)
3. Invoke Chat Completions
const chatCompletion = await portkey.chat.completions.create({
messages: [{ role: 'user', content: 'Who are you?' }],
model: 'glm-4'
});
console.log(chatCompletion.choices);
I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖
completion = portkey.chat.completions.create(
    messages=[{"role": "user", "content": "Who are you?"}],
    model="glm-4"
)
print(completion)
I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖
curl https://api.portkey.ai/v1/chat/completions \
-H "Content-Type: application/json" \
-H "x-portkey-api-key: $PORTKEY_API_KEY" \
-H "x-portkey-virtual-key: $ZHIPUAI_VIRTUAL_KEY" \
-d '{
"messages": [{"role": "user", "content": "Who are you?"}],
"model": "glm-4"
}'
I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖
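If you are calling the gateway from a plain HTTP client rather than an SDK, the same two headers shown in the curl example carry the credentials. A minimal sketch in Python (header names taken from the curl call above; the `requests.post` call is shown as a comment for illustration only):

```python
import json

# Gateway endpoint used in the curl example above.
PORTKEY_GATEWAY_URL = "https://api.portkey.ai/v1/chat/completions"

def portkey_headers(api_key: str, virtual_key: str) -> dict:
    """Build the auth headers the Portkey gateway expects for a raw HTTP call."""
    return {
        "Content-Type": "application/json",
        "x-portkey-api-key": api_key,
        "x-portkey-virtual-key": virtual_key,
    }

# Request body in the OpenAI chat-completions shape.
payload = json.dumps({
    "messages": [{"role": "user", "content": "Who are you?"}],
    "model": "glm-4",
})

headers = portkey_headers("PORTKEY_API_KEY", "ZHIPUAI_VIRTUAL_KEY")
# An actual request would then be, e.g.:
# requests.post(PORTKEY_GATEWAY_URL, headers=headers, data=payload)
```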
Next Steps
The complete list of features supported in the SDK is available at the link below.
You'll find more information in the relevant sections:
- Add metadata to your requests
- Add gateway configs to your ZhipuAI requests
- Tracing ZhipuAI requests
- Set up a fallback from OpenAI to ZhipuAI
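The fallback setup linked above is expressed as a gateway config. As a sketch (assuming the fallback config shape described in Portkey's config documentation; the virtual key names here are placeholders), a config that tries OpenAI first and falls back to ZhipuAI could look like:

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "virtual_key": "openai-virtual-key" },
    { "virtual_key": "zhipuai-virtual-key" }
  ]
}
```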