OpenClaw: GPT-5.1, 5.2, and 5.3 Rejected via Codex OAuth (Only gpt-5.4 Works)
If you're using the openai-codex provider with ChatGPT/Codex OAuth and
hitting "The 'gpt-5.x' model is not supported when using Codex with a ChatGPT account"
errors, this is a known regression (GitHub #67158), not your config.
Here's what's happening and how to work around it.
What's Breaking
When authenticating with OpenClaw via the openai-codex / ChatGPT-Codex OAuth
path, only gpt-5.4 is currently accepted. Requests to gpt-5.1,
gpt-5.2, and gpt-5.3 variants all fail with:
{"detail":"The 'gpt-5.1' model is not supported when using Codex with a ChatGPT account."}
The model name in the error changes to match whichever variant you tried. The underlying
cause appears to be a server-side policy change on OpenAI's end: the Codex OAuth path
now only permits the latest gpt-5.4 model, regardless of what OpenClaw
has configured.
This is a regression; these models previously worked through the same auth path.
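To make the reported policy concrete, here's a minimal sketch of the accept/reject behavior users are seeing. The helper function and its matching rules are illustrative (mine, not part of OpenClaw or the Codex API); they just encode what issue #67158 describes:

```shell
# Illustrative sketch of the server-side policy reported in #67158:
# the Codex OAuth path accepts only gpt-5.4; 5.1-5.3 variants are rejected.
codex_oauth_accepts() {
  case "$1" in
    gpt-5.4*) echo "accepted" ;;
    gpt-5.1*|gpt-5.2*|gpt-5.3*) echo "rejected" ;;
    *) echo "unknown" ;;
  esac
}

codex_oauth_accepts gpt-5.1        # rejected
codex_oauth_accepts gpt-5.3-codex  # rejected
codex_oauth_accepts gpt-5.4        # accepted
```

Note that the model name echoed back in the error message tracks whichever variant you requested, but the outcome is the same for every pre-5.4 variant.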
Secondary Issue: Refresh Token Reuse Errors
Some users hitting this bug are also seeing a second error in the auth status:
[openai-codex] Token refresh failed: 401
"message": "Your refresh token has already been used to generate a new access token. Please try signing in again."
"code": "refresh_token_reused"
This is a separate problem: a stale OAuth refresh token that needs to be re-issued. If you're seeing both errors, fix the refresh token first (re-authenticate), then verify the model issue.
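If you want to script the triage between the two failure modes, a minimal sketch follows. The saved error body, its path, and the string matching are assumptions based on the error text quoted above, not an OpenClaw feature:

```shell
# Sketch: distinguish the two failure modes from a captured error body.
# The file path and sample body below are illustrative assumptions.
cat > /tmp/openclaw-err.json <<'EOF'
{"message": "Your refresh token has already been used to generate a new access token.", "code": "refresh_token_reused"}
EOF

if grep -q '"code": "refresh_token_reused"' /tmp/openclaw-err.json; then
  echo "stale refresh token: re-authenticate first"
elif grep -q "not supported when using Codex" /tmp/openclaw-err.json; then
  echo "model-policy rejection: switch the default model to gpt-5.4"
fi
```

The ordering matters: the refresh-token check comes first, mirroring the advice above to re-auth before debugging the model rejection.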
Immediate Fix: Set gpt-5.4 as Default, Remove Fallbacks
Update your openclaw.json to use only openai-codex/gpt-5.4
and remove any fallback entries pointing to older model variants:
{
"models": {
"default": "openai-codex/gpt-5.4"
}
}
If you had fallbacks configured like gpt-5.3-codex or gpt-5.2-codex,
remove them; they'll just cause failed requests when the primary model is unavailable.
Either remove fallbacks entirely or switch them to a different provider (see below).
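If you'd rather keep a fallback than drop it, one possible shape is to fall back to the direct openai provider instead of another Codex OAuth model. Note the "fallbacks" key name and the exact provider layout here are assumptions to adapt to your OpenClaw version; only the openai-codex/gpt-5.4 default is known-good from this issue:

```json
{
  "models": {
    "default": "openai-codex/gpt-5.4",
    "fallbacks": ["openai/gpt-5.1"]
  },
  "providers": {
    "openai": {
      "apiKey": "sk-..."
    }
  }
}
```

This keeps the working OAuth model as the primary while routing overflow to an API-key-billed provider rather than to a model the OAuth path will reject.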
Fix the Refresh Token Error
If you're also seeing the refresh_token_reused error, re-authenticate:
openclaw auth logout openai-codex
openclaw auth login openai-codex
This re-issues a fresh OAuth token pair. The "already been used" error means your current refresh token was invalidated server-side; logging out and back in gets you a clean token.
Longer-Term Workaround: Use the OpenAI API Key Instead
The ChatGPT/Codex OAuth path is inherently dependent on what OpenAI chooses to allow through it. If you need access to specific GPT-5 model variants and don't want to be subject to OAuth-tier policy restrictions, switch to a direct OpenAI API key:
{
"providers": {
"openai": {
"apiKey": "sk-..."
}
},
"models": {
"default": "openai/gpt-5.1"
}
}
Direct API access gives you full model selection control. The tradeoff is that it bills against your OpenAI API account rather than your ChatGPT subscription, so check your pricing tier before switching.
Context Window Comparison
If you were using older model variants specifically for their context window, here's what you're working with on gpt-5.4 via Codex OAuth:
openai-codex/gpt-5.4: 1,025K context (as reported by openclaw models list)
openai-codex/gpt-5.3-codex: 266K context (fallback, now rejected via OAuth)
gpt-5.4 actually has a larger context window than the 5.3 fallback, so for most use cases removing the broken fallbacks is neutral or a net improvement.
Status
GitHub issue #67158 is open. The fix may need to come from OpenAI re-enabling those models on the OAuth path; there's no OpenClaw config change that forces OpenAI to accept a rejected model. If you need gpt-5.1/5.2/5.3 specifically, the direct API key route is the cleanest path.
OpenAI provider config tripping you up?
ClawReady reviews your full model and provider config and makes sure your fallbacks are set up correctly: no broken auth, no silent failures. $49 audit.
Book a Free Call →