Expanded support for self-hosted LLMs

May 28, 2025

As of Sourcegraph 6.4, customers using self-hosted models through the `openaicompatible` provider can now configure:

  • Custom headers: Add authentication keys or other custom HTTP headers to LLM requests
  • Non-legacy completions: Use `/chat/completions` instead of the older `/completions` endpoint for Cody’s autocomplete queries
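As a rough sketch, a site configuration using these options might look like the fragment below. The overall `modelConfiguration` / `providerOverrides` shape follows Sourcegraph's model configuration schema, but the `headers` and `useLegacyCompletions` field names shown here are illustrative assumptions rather than confirmed names; consult the docs linked below for the exact schema.

```json
{
  "modelConfiguration": {
    "providerOverrides": [
      {
        "id": "my-self-hosted-llm",
        "displayName": "Self-hosted LLM",
        "serverSideConfig": {
          "type": "openaicompatible",
          "endpoints": [
            {
              "url": "https://llm.internal.example.com/v1",
              // Illustrative: attach a custom auth header to every LLM request
              "headers": { "X-Api-Key": "REDACTED" }
            }
          ],
          // Illustrative: opt out of the legacy /completions endpoint
          "useLegacyCompletions": false
        }
      }
    ]
  }
}
```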

These updates make Cody compatible with a wider range of LLM gateways and inference servers, giving customers more flexibility to use their preferred infrastructure.
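To make the endpoint change concrete, the sketch below contrasts the two request-body shapes an OpenAI-compatible server accepts. The paths and field names follow the widely used OpenAI API convention; the model name and prompt are placeholders, and this is not Sourcegraph's internal client code.

```python
# Payload shapes for the two OpenAI-compatible completion endpoints.
# Legacy servers expose POST /v1/completions; newer gateways often
# only implement POST /v1/chat/completions.

def legacy_completion_payload(prompt: str, model: str) -> dict:
    """Request body for the older /completions endpoint (plain prompt string)."""
    return {"model": model, "prompt": prompt, "max_tokens": 64}


def chat_completion_payload(prompt: str, model: str) -> dict:
    """Request body for /chat/completions (prompt wrapped in a messages list)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
```

The key difference is structural: the legacy endpoint takes a bare `prompt` string, while the chat endpoint expects a `messages` array of role/content pairs, which is why gateways that only implement the newer API were previously incompatible with Cody's autocomplete requests.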

Learn more about model support in our docs.
