OpenClaw openai-codex OAuth Fails on VPS But Works on Laptop: Cloudflare Bot Check Explained
Your openai-codex OAuth setup works perfectly on your home machine. You move it to a DigitalOcean, Hetzner, or AWS VPS, and suddenly every request fails with a Cloudflare 403 or a ChatGPT backend rejection, even though the session credentials are identical. This is issue #67798, and it comes down to how Cloudflare scores VPS traffic versus residential traffic.
Why VPS Requests Get Rejected
OpenClaw's openai-codex provider uses a ChatGPT OAuth session to make API calls. ChatGPT is protected by Cloudflare, which maintains risk profiles for IP addresses. VPS provider IP ranges (DigitalOcean, Hetzner, AWS, Linode, Vultr) are well-known to Cloudflare and flagged as datacenter/bot traffic by default.
On your home machine with a residential ISP IP, the same OAuth credentials sail through. On a VPS with a datacenter IP, Cloudflare's automated checks trigger regardless of credential validity. OpenClaw can't reproduce a real browser fingerprint from a headless server environment, so the check fails.
The error surfaces as either a Cloudflare 403 challenge page or a ChatGPT backend rejection. Both look like auth failures but are actually IP reputation blocks.
Fix #1: Switch to a Direct OpenAI API Key (Recommended)
The cleanest fix is to abandon the ChatGPT OAuth path and use a direct OpenAI API key instead. API calls to api.openai.com don't go through the ChatGPT/Cloudflare stack; they go straight to the OpenAI API endpoint with no browser emulation required.
// openclaw.json: replace openai-codex with the direct openai provider
{
  "providers": {
    "openai": {
      "apiKey": "sk-your-openai-key",
      "model": "gpt-5.4"
    }
  }
}
This works reliably from any server, any IP, any geography. It also gives you full model selection (not limited to what ChatGPT OAuth exposes), proper rate limits, and usage tracking.
Fix #2: Use a Residential Proxy
If you specifically need the ChatGPT OAuth path (e.g., you're on the ChatGPT subscription tier and want to use that compute), route your OpenClaw traffic through a residential proxy. Cloudflare scores residential IPs much lower on bot probability.
Configure it in OpenClaw via the httpsProxy setting or system-level proxy environment variables:
export HTTPS_PROXY="http://user:pass@residential-proxy:port"
openclaw gateway restart
Residential proxy providers include Oxylabs, Bright Data, and IPRoyal. Costs run $5-20/month depending on traffic. Not worth it if you can use a direct API key instead.
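Note that a shell `export` only affects the current session. If OpenClaw runs as a systemd service, the proxy variables need to reach the service environment. A sketch of a drop-in override, assuming a hypothetical `openclaw.service` unit name and placeholder proxy credentials:

```ini
# /etc/systemd/system/openclaw.service.d/proxy.conf
# Hypothetical unit name; adjust to however OpenClaw is installed on your box.
[Service]
Environment="HTTPS_PROXY=http://user:pass@residential-proxy:port"
Environment="NO_PROXY=localhost,127.0.0.1"
```

Apply it with `systemctl daemon-reload && systemctl restart openclaw` so the service picks up the proxy on its next start.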
Fix #3: Move OpenClaw to a Home Machine or Tunnel It
If you want VPS-like always-on deployment without VPS IP reputation issues, run OpenClaw on a home mini PC (NUC, Mac Mini, Raspberry Pi 5) and expose it via a Cloudflare Tunnel or Tailscale. The traffic originates from your home residential IP. Cloudflare sees it as legitimate browser traffic.
This is actually the ideal setup for the openai-codex OAuth path: it was designed for personal machines, not servers.
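As a rough sketch of the tunnel side, a cloudflared config on the home machine might look like this. The tunnel name, hostname, and port are placeholders, and the assumption here is that the OpenClaw gateway listens on a local HTTP port:

```yaml
# ~/.cloudflared/config.yml, created after `cloudflared tunnel create openclaw`
tunnel: openclaw
credentials-file: /home/user/.cloudflared/<tunnel-id>.json
ingress:
  - hostname: openclaw.example.com     # placeholder hostname on your zone
    service: http://localhost:8080     # wherever the OpenClaw gateway listens
  - service: http_status:404           # catch-all rule required by cloudflared
```

The tunnel only handles inbound access to the gateway; the outbound OAuth traffic to ChatGPT still originates from your home residential IP, which is the whole point.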
How to Tell If This Is Your Problem
# Test the raw endpoint from your VPS
curl -I "https://chatgpt.com/backend-api/v1/models" \
-H "User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36"
If you get a 403 with cf-mitigated: challenge in the response headers, this is your issue. If you get a 401 or 403 without cf-mitigated, it's a different auth problem.
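To script that check, a small helper (a sketch; the header name matches what the curl output above shows) can classify a response-header dump:

```shell
#!/bin/sh
# Classify an HTTP response-header dump read from stdin:
# Cloudflare IP-reputation block vs. an ordinary auth failure.
classify_response() {
  if grep -qi '^cf-mitigated: *challenge' -; then
    echo "cloudflare-block"        # datacenter IP flagged: see the fixes above
  else
    echo "other-auth-failure"      # credentials/session problem instead
  fi
}

# Usage: pipe the curl -I output from the diagnosis step into it:
# curl -sI "https://chatgpt.com/backend-api/v1/models" | classify_response
```

Anything that prints cloudflare-block points at IP reputation, not at your OAuth credentials.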
TL;DR
- openai-codex OAuth works on laptops but fails on VPSes because Cloudflare flags datacenter IPs
- Fix A: Switch to direct OpenAI API key (cleanest, recommended)
- Fix B: Route through a residential proxy
- Fix C: Run OpenClaw on home hardware + Cloudflare Tunnel
- Diagnosis: cf-mitigated: challenge in the curl response headers = Cloudflare block
- Tracking: GitHub issue #67798
Running OpenClaw on a VPS or server?
ClawReady can configure your server-based setup to avoid common provider pitfalls โ right provider config, right model path, right deployment pattern for your hardware.
Book a Free Call →