Expanded support for self-hosted LLMs
Customers using self-hosted models can now configure custom headers and non-legacy completions for greater flexibility.
As of Sourcegraph 6.4, customers using self-hosted models through the openaicompatible provider can now configure:

- Custom headers on the requests Cody sends to the model endpoint, for example the authentication or routing headers an internal gateway requires
- Non-legacy completions, i.e. the chat-style completions API rather than the legacy completions endpoint
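To illustrate, here is a minimal sketch of the kind of request these options enable against an OpenAI-compatible gateway: a call to the non-legacy chat completions endpoint with custom headers attached. The gateway URL, header names, and model name below are placeholders, not values from any specific deployment.

```python
# Minimal sketch of a non-legacy (chat) completions request with custom headers,
# the shape an OpenAI-compatible gateway or inference server expects.
# The URL, header names, and model name are illustrative placeholders.
import requests

GATEWAY_URL = "https://llm-gateway.internal.example.com/v1/chat/completions"

response = requests.post(
    GATEWAY_URL,
    headers={
        "Authorization": "Bearer <token>",
        # Custom headers, e.g. for routing or auditing inside your gateway
        "X-Org-Id": "engineering",
        "X-Request-Source": "cody",
    },
    json={
        "model": "my-self-hosted-model",
        # Chat-style "messages" payload (non-legacy), rather than the single
        # "prompt" string used by the legacy completions endpoint
        "messages": [{"role": "user", "content": "Write a haiku about Go."}],
        "max_tokens": 64,
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```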
These updates make Cody compatible with a wider range of LLM gateways and inference servers, giving customers more flexibility to use their preferred infrastructure.
Learn more about model support in our docs.