After subscribing to Zhipu's Coding plan, and with a "the more I use it, the more value I get" mindset, I have hooked Zhipu's models up to the Chatbox and CLI tools I use; the GLM-4.6 model was recently added, which is all the more reason to use it. I also planned to configure it for Copilot, but the OAIModels feature in VSCode is still experimental, and I ran into quite a few pitfalls while adding models.
To use custom models, I found two approaches: a community plugin, and the customOAIModels configuration setting. Only the community plugin worked for me; with customOAIModels, the added models never show up and there is no way to configure the API key.
Method 1: Install Plugin#
This plugin was recommended by someone in the repository's issue tracker.
Issue link
Add custom OpenAI endpoint configuration (Base URL & Model Parameter) of copilot chat settings #7518
Plugin link
OAI Compatible Provider for Copilot
After installing the plugin from the VSCode marketplace, click into its settings, jump to the JSON configuration page, and add the following content.
"oaicopilot.baseUrl": "https://open.bigmodel.cn/api/coding/paas/v4",
"oaicopilot.models": [
{
"id": "glm-4.6",
"owned_by": "zhipuai",
"context_length": 200000,
"max_tokens": 132000,
},
{
"id": "glm-4.5",
"owned_by": "zhipuai",
"context_length": 128000,
"max_tokens": 96000,
},
{
"id": "glm-4.5-air",
"owned_by": "zhipuai",
"context_length": 128000,
"max_tokens": 96000,
},
]
For more configuration options, see the plugin's repository.
After completing the configuration, press Ctrl+Shift+P to open the model management page.
Select OAI Compatible, fill in the API key, and check the models you want to use.
Finally, the added models appear in the chat model picker, and the logic for other model providers is the same, as sketched below.
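For example, pointing the plugin at a different OpenAI-compatible provider only changes the base URL and the model entries; the endpoint and model id below are placeholders, not a tested configuration.
"oaicopilot.baseUrl": "https://example-provider.com/v1",  // placeholder endpoint, replace with your provider's base URL
"oaicopilot.models": [
  {
    "id": "example-model",  // placeholder model id from that provider
    "owned_by": "example-provider",
    "context_length": 128000,
    "max_tokens": 8192
  }
]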
Method 2: CustomOAIModels Configuration#
This configuration method was introduced in a PR in the Copilot repository, but after configuring it I still could not find the newly added models in the model picker, nor could I configure my API key. customOAIModels is marked as an experimental feature in the settings, so whether it works really is a matter of luck.
Add support for generic OAI endpoints #621
The configuration method is to add the following to VSCode's settings.json, for example:
"github.copilot.chat.customOAIModels": {
"gpt-4.1": {
"name": "GPT-4.1 Custom",
"maxInputTokens": 64768,
"maxOutputTokens": 16192,
"toolCalling": true,
"url": "https://api.openai.com/v1/chat/completions",
"vision": true,
"requiresAPIKey": true
},
"openai/gpt-oss-20b": {
"name": "GPT-OSS-20B",
"maxInputTokens": 32768,
"maxOutputTokens": 8192,
"toolCalling": false,
"url": "http://127.0.0.1:1234/v1/chat/completions",
"vision": false,
"requiresAPIKey": false
}
}
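In theory, the same setting should be able to describe the Zhipu endpoint. The sketch below reuses the GLM-4.6 parameters from Method 1 and simply appends /chat/completions to the Coding plan base URL; the URL path and the toolCalling/vision flags are my assumptions, and I could not verify any of it since the models never appeared for me.
"github.copilot.chat.customOAIModels": {
  "glm-4.6": {
    "name": "GLM-4.6",
    "maxInputTokens": 200000,   // context_length from the Method 1 config
    "maxOutputTokens": 132000,  // max_tokens from the Method 1 config
    "toolCalling": true,        // assumption
    "url": "https://open.bigmodel.cn/api/coding/paas/v4/chat/completions",  // assumed path on the Coding plan base URL
    "vision": false,            // assumption
    "requiresAPIKey": true
  }
}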
This method is also mentioned in an issue in an older Copilot configuration repository, so it might work.
Add github.copilot.chat.customOAIModels configuration description #43
The official documentation also mentions this method.
AI language models in VS Code
So this method is there for reference as well; in any case, I was not able to configure it successfully.
Reference Links#
- https://docs.bigmodel.cn/cn/coding-plan/tool/others
- https://code.visualstudio.com/docs/copilot/customization/language-models
- https://github.com/doggy8088/github-copilot-configs/issues/43
- https://github.com/microsoft/vscode-copilot-chat/pull/621
- https://github.com/microsoft/vscode-copilot-release/issues/7518#issuecomment-3294489890
- https://code.visualstudio.com/api/extension-guides/ai/language-model-chat-provider#overview