<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Thanos Lefteris</title><description>Personal website and blog of Thanos Lefteris</description><link>https://thanoslefteris.com/</link><language>en-us</language><item><title>How to Configure GitHub Copilot CLI to Use Z.ai&apos;s GLM Coding Plan</title><link>https://thanoslefteris.com/blog/copilot-cli-byok-zai-glm/</link><guid isPermaLink="true">https://thanoslefteris.com/blog/copilot-cli-byok-zai-glm/</guid><pubDate>Mon, 20 Apr 2026 00:00:00 GMT</pubDate><content:encoded>&lt;h2&gt;Introduction&lt;/h2&gt;
&lt;p&gt;GitHub Copilot CLI &lt;a href=&quot;https://github.blog/changelog/2026-04-07-copilot-cli-now-supports-byok-and-local-models/&quot;&gt;recently added&lt;/a&gt; support for BYOK (Bring Your Own Key). This means you can point Copilot CLI at an Anthropic- or OpenAI-compatible endpoint and use the models it serves, without needing a GitHub Copilot subscription. In this post I&apos;ll explain how to use the GLM-5.1 model from Z.ai&apos;s GLM Coding Plan subscription.&lt;/p&gt;
&lt;p&gt;But first, here is a short explanation of the tools used.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://github.com/features/copilot/cli&quot;&gt;GitHub Copilot CLI&lt;/a&gt; is the official TUI AI agent by GitHub. It&apos;s one of the many ways GitHub offers to use a Copilot subscription, and it integrates nicely with Visual Studio Code as well.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://z.ai/subscribe&quot;&gt;Z.ai&apos;s GLM Coding Plan&lt;/a&gt; is Z.ai&apos;s subscription built specifically for AI-powered coding. They have &lt;a href=&quot;https://docs.z.ai/devpack/overview&quot;&gt;documentation&lt;/a&gt; on how to use the subscription from many different coding agents.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://z.ai/blog/glm-5.1&quot;&gt;GLM-5.1&lt;/a&gt; is Z.ai&apos;s flagship model (as of this writing). It&apos;s competitive with the flagship models from other major AI labs.&lt;/p&gt;
&lt;h2&gt;Prerequisites&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Linux or macOS&lt;/li&gt;
&lt;li&gt;GitHub Copilot CLI &lt;a href=&quot;https://docs.github.com/en/copilot/how-tos/copilot-cli/set-up-copilot-cli/install-copilot-cli&quot;&gt;installed&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;An active Z.ai GLM Coding Plan subscription
&lt;ul&gt;
&lt;li&gt;A Z.ai API key&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Configuration&lt;/h2&gt;
&lt;p&gt;There are two ways to configure Copilot CLI: using the Anthropic-compatible API or using the OpenAI-compatible API. I haven&apos;t found any practical difference between the two integration methods.&lt;/p&gt;
&lt;p&gt;Unfortunately, this cannot be configured in the &lt;code&gt;~/.copilot/config.json&lt;/code&gt; file. Instead, you need to set the &lt;a href=&quot;https://docs.github.com/en/copilot/how-tos/copilot-cli/customize-copilot/use-byok-models#configuring-your-provider&quot;&gt;appropriate environment variables&lt;/a&gt;. If you are going to use the commands below often, you will probably want to save them as a shell alias or function.&lt;/p&gt;
&lt;h3&gt;A. Configure using the OpenAI-compatible API&lt;/h3&gt;
&lt;pre&gt;&lt;code&gt;COPILOT_OFFLINE=true \
COPILOT_PROVIDER_TYPE=openai \
COPILOT_PROVIDER_BASE_URL=https://api.z.ai/api/coding/paas/v4 \
COPILOT_PROVIDER_API_KEY=REPLACE_WITH_YOUR_ZAI_KEY \
COPILOT_MODEL=glm-5.1 \
COPILOT_PROVIDER_MAX_PROMPT_TOKENS=204800 \
COPILOT_PROVIDER_MAX_OUTPUT_TOKENS=131072 \
copilot
&lt;/code&gt;&lt;/pre&gt;
&lt;h3&gt;B. Configure using the Anthropic-compatible API&lt;/h3&gt;
&lt;pre&gt;&lt;code&gt;COPILOT_OFFLINE=true \
COPILOT_PROVIDER_TYPE=anthropic \
COPILOT_PROVIDER_BASE_URL=https://api.z.ai/api/anthropic \
COPILOT_PROVIDER_API_KEY=REPLACE_WITH_YOUR_ZAI_KEY \
COPILOT_MODEL=glm-5.1 \
COPILOT_PROVIDER_MAX_PROMPT_TOKENS=204800 \
COPILOT_PROVIDER_MAX_OUTPUT_TOKENS=131072 \
copilot
&lt;/code&gt;&lt;/pre&gt;
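&lt;p&gt;If you use one of these configurations regularly, you can wrap it in a shell function instead of retyping the environment variables. Here is a sketch for Bash or Zsh; the function name &lt;code&gt;copilot-glm&lt;/code&gt; and the idea of reading the key from a &lt;code&gt;ZAI_API_KEY&lt;/code&gt; environment variable are my own choices, not part of the official docs:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Add to ~/.bashrc or ~/.zshrc. Expects ZAI_API_KEY to be exported
# elsewhere (e.g. from a secrets file), so the key never lives here.
copilot-glm() {
  COPILOT_OFFLINE=true \
  COPILOT_PROVIDER_TYPE=anthropic \
  COPILOT_PROVIDER_BASE_URL=https://api.z.ai/api/anthropic \
  COPILOT_PROVIDER_API_KEY=&quot;$ZAI_API_KEY&quot; \
  COPILOT_MODEL=glm-5.1 \
  COPILOT_PROVIDER_MAX_PROMPT_TOKENS=204800 \
  COPILOT_PROVIDER_MAX_OUTPUT_TOKENS=131072 \
  copilot &quot;$@&quot;
}
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;A function is preferable to an alias here because it forwards any extra arguments (for example &lt;code&gt;--help&lt;/code&gt;) straight through to &lt;code&gt;copilot&lt;/code&gt;, and the variable assignments prefixing the command apply only to that invocation rather than polluting your shell session.&lt;/p&gt;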
</content:encoded></item></channel></rss>