macOS 26β is out, and so is Xcode 26β with LLM support. Unfortunately, I couldn’t get any of it to run, as the beta refuses to let me use Apple Intelligence – because apparently Apple Intelligence isn’t available in virtual machines at all? Wild. But not being able to set up Apple Intelligence means not being able to use the ChatGPT models in Xcode 26β, which is one of its main draws for me.
Well, along comes Simon Støvring with a helpful blog post, showing that you can make the app use your Anthropic models! Thanks, mate 🤙🏼
Naturally, I tried to get both OpenAI and OpenRouter working with my own keys, and after inspecting the network traffic and fiddling with the parameters I managed to get it partially working.

My findings, lest I forget:

- The “URL” parameter is the API’s base URL (must be OpenAI-compatible) up to, but not including, the `/v1`, e.g. `https://api.openai.com/` instead of `https://api.openai.com/v1`.
- The “API Key” parameter is the API key your provider gave you, e.g. `sk-whatever`; the “API Key Header” parameter is `x-api-key`. But:
  - If the provider in question uses the HTTP `Authorization` header to check the API key, we’re in luck: the “API Key Header” can also be set to `Authorization`, and in that case Xcode will automatically prefix the API key with the string “Bearer”, e.g. `Bearer sk-whatever` (sketched below).
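To double-check my reading of the settings, here’s a rough Swift sketch of the request shape they seem to produce. The base URL, key, and model path are placeholders I picked for illustration, not anything extracted from Xcode itself; it assumes the standard OpenAI-compatible `/v1/chat/completions` path.

```swift
import Foundation

// Placeholder values throughout – swap in your own provider and key.
let baseURL = URL(string: "https://api.openai.com/")! // the “URL” setting: no trailing /v1
let apiKey = "sk-whatever"                            // the “API Key” setting

var request = URLRequest(url: baseURL.appendingPathComponent("v1/chat/completions"))
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
// With “API Key Header” set to Authorization, Xcode prefixes the key with “Bearer ”:
request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
// With an x-api-key provider, the raw key would go in that header instead:
// request.setValue(apiKey, forHTTPHeaderField: "x-api-key")

print(request.url!.absoluteString)        // https://api.openai.com/v1/chat/completions
print(request.allHTTPHeaderFields ?? [:]) // the headers as the provider would see them
```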

So I got OpenAI support working even without Apple Intelligence configured. OpenRouter, not so much: my guess is that OpenRouter’s list of available models is just too damn big for Xcode at this point. TBF, it’s a humongous list.
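If you want to eyeball just how humongous, you can count the models a provider advertises. A quick sketch, assuming OpenRouter’s public, OpenAI-style models endpoint and its usual `data` array response (no API key needed there, as far as I can tell):

```swift
import Foundation

// Count the entries in OpenRouter’s model list (assumed endpoint and response shape).
let url = URL(string: "https://openrouter.ai/api/v1/models")!
let done = DispatchSemaphore(value: 0)

URLSession.shared.dataTask(with: url) { data, _, _ in
    defer { done.signal() }
    guard
        let data,
        let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
        let models = json["data"] as? [[String: Any]]
    else {
        print("Couldn’t parse the model list")
        return
    }
    print("OpenRouter currently lists \(models.count) models")
}.resume()

done.wait()
```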
Thanks again to Simon for the pointer!
Oh, you’re still here. Hmm. Did you know I make pretty useful Shortcuts-related macOS & iOS apps, like …
- a contextual Shortcuts launcher
- one that brings Shortcuts support to Obsidian
- one that adds Shortcuts support to Chrome, Chromium etc.?