Latest updates to the Sourcegraph Code Intelligence Platform!
Cody releases will now have more thorough and detailed release notes.
Cody now supports the updated Claude 3.5 Sonnet and Claude 3.5 Haiku models across all plans.
We now have an experimental version of Cody available for Visual Studio, allowing users access to Cody chat. This is the first step in the journey towards general availability and we'd love to hear your feedback.
Prompts can now be promoted to the top of the Prompts list, allowing organizations to highlight specific Prompts that encourage best practices or are recommended for use.
The new search experience launched in August is now the default for users, with the option to opt out and return to the old view. The new search experience brings faster performance and simplifies search and navigation.
Prompt and command use and discovery have been simplified in JetBrains and for Cody Enterprise users, and the chat window has been tidied up so that only Prompt drafts you own appear in the UI.
Previously, Sourcegraph only supported enforcing username-based restrictions on individual files and folders based on the Perforce protections table. In this release, Sourcegraph also supports access control based on a user's IP address.
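To illustrate, here is a hypothetical Perforce protections table using the standard `<mode> <user|group> <name> <host> <path>` format, where the host field scopes an entry to an IP address or range (the specific users, groups, and depot paths below are made up for this example):

```
## read access for alice from any host
read    user    alice    *              //depot/project/...
## write access for alice only from the internal network
write   user    alice    10.0.0.0/8     //depot/project/...
## exclusion for a group, scoped to a specific IP range
list    group   interns  192.168.1.*    -//depot/secrets/...
```

With this release, Sourcegraph can take the host column into account when deciding what a user may see, rather than enforcing only the username-based entries.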
Cody has a new interface for quickly selecting Prompts and commands from the sidebar.
Cody v1.38 brings significant improvements to chat performance and updates to the autocomplete provider.
In the latest Sourcegraph release for Enterprise Cloud customers, we've improved the context fetched from @-mentioned repos, providing greater precision and increasing the likelihood that Cody returns higher-quality responses.
Single-job pods for native Kubernetes Executors have been the default for the past couple of releases, and based on positive feedback we are now removing multi-job pods as a configuration option.
DeepSeek-V2 was recently introduced as the default autocomplete model for Cody Enterprise customers, and we have implemented optimizations to prompt caching and direct routing that have resulted in improved latency and quality for both single-line and multi-line completions.
Smart Apply now supports executing commands in the terminal, allowing users to run a suggestion Cody provides with a single click.
OpenAI just unveiled its latest models, o1-preview and o1-mini, which offer enhanced reasoning capabilities for tackling complex tasks. Both models are now available in Cody as a limited release.
We are sunsetting Cody Ignore in this release. Cody Ignore was an experimental feature that was off by default for all users, only enabled via the experimental feature setting in the Cody extension. It let users specify files or folders in a .cody/ignore file, which Cody then ignored and excluded from context.
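For reference, a .cody/ignore file used gitignore-style patterns; the paths below are hypothetical examples of the kind of content teams excluded from Cody's context:

```
# Exclude secrets and credentials from Cody's context
secrets/
config/credentials.json
**/*.pem

# Exclude generated code
dist/
```

After this release, these files are no longer read, and the experimental setting that enabled them has been removed.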
Cody is built on model interoperability, and we aim to provide access to the best and latest models. Today we’re updating the default models offered to Enterprise customers: DeepSeek-V2 is the recommended default for autocomplete, while Claude 3.5 Sonnet is now the recommended default for chat and prompts.
The core experience of using Cody is alongside your code in your IDE of choice, but there are often times when you want to interact with Cody via the web. Cody Web can be particularly helpful as part of a workflow where you’re performing a search via Code Search and need help from Cody, or when you’re on your mobile device and want to ask it a question. We’re happy to announce that Cody Web is now generally available and includes numerous improvements that make the web experience better and more consistent.
Smart Apply is now available for JetBrains, allowing users to take suggestions from the Cody chat window and turn them near-instantly into diffs in their code.
In our quest for consistency across how you use Cody, JetBrains users now get the same side panel experience found in Cody for VS Code and on the web. Chat lives in the side panel, there are now dedicated tabs for easier navigation, and prompts are now easier to discover and use.
Search Jobs are now generally available, allowing for search queries to be run across your entire codebase where completeness is prioritized over quick response times and results ranking.
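As an illustration of the kind of exhaustive query Search Jobs is built for, here is a hypothetical Sourcegraph search using the `count:all` filter to request every match rather than a ranked subset (the repo pattern and search term are invented for this example):

```
repo:^github\.com/example-org/.* deprecatedHelper( count:all
```

A query like this can run as a Search Job when completeness across the whole codebase matters more than a fast, ranked first page of results.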