TL;DR
Sourcegraph Cody is safe for most uses, with strong enterprise options including self-hosted deployment that keeps code entirely in your network. Its distinguishing advantage is deep codebase awareness through Sourcegraph's code intelligence platform. As with all AI tools, generated code needs review, but Cody's architecture and enterprise controls are solid.
What is Cody?
Cody is Sourcegraph's AI coding assistant that combines large language models with Sourcegraph's code intelligence platform. This gives it deep understanding of your entire codebase, not just the current file. It's available as a VS Code extension, JetBrains plugin, and through Sourcegraph's web interface.
Our Verdict
What's Good
- Deep codebase understanding
- Self-hosted option available
- Multiple LLM choices
- SOC 2 Type II certified
- Good free tier
What to Watch
- Requires indexing your code
- More complex setup than others
- Generated code needs review
- Enterprise features cost more
- Smaller community
Codebase-Aware AI
Cody's unique feature is its integration with Sourcegraph's code search and intelligence:
How it works: Cody indexes your codebase using Sourcegraph, allowing it to find relevant code across your entire repository when answering questions or generating code. This means more contextually accurate suggestions.
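As a rough illustration of wiring the editor extension to an indexed instance, the VS Code settings look something like the fragment below. The setting names here are assumptions and vary by extension version; check the Cody extension documentation for the current schema.

```jsonc
{
  // Hypothetical sketch: point Cody at your Sourcegraph instance
  // so completions and chat can draw on the repo-wide index.
  "cody.serverEndpoint": "https://sourcegraph.example.com",
  "cody.codebase": "github.com/your-org/your-repo"
}
```

Once the endpoint is set, questions asked in the editor can be answered with context retrieved from the whole indexed repository rather than just open files.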
Security Implications
The codebase indexing has security considerations:
- Your code is indexed by Sourcegraph (cloud or self-hosted)
- Queries may include code snippets
- Better context means better suggestions, but also more code leaving the editor in cloud deployments
- Self-hosted option keeps everything in your network
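For teams weighing the self-hosted route, a single-node deployment can be sketched as below. This is illustrative only; the image tag, ports, and volume paths are assumptions, and real deployments should follow Sourcegraph's official docker-compose or Kubernetes guides.

```yaml
# Illustrative single-node sketch for evaluation, not production.
services:
  sourcegraph:
    image: sourcegraph/server:insiders   # pin a specific release tag in practice
    ports:
      - "7080:7080"                      # web UI and API
    volumes:
      - ./config:/etc/sourcegraph        # site configuration
      - ./data:/var/opt/sourcegraph      # repo index and database data
```

With this model, both the code index and all Cody queries stay inside your own infrastructure.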
Deployment Options
| Feature | Free | Pro | Enterprise |
|---|---|---|---|
| Completions | Limited | Unlimited | Unlimited |
| Chat messages | Limited | Unlimited | Unlimited |
| Self-hosted | No | No | Yes |
| LLM choice | Default | Multiple | Custom + BYOM |
| Admin controls | No | Basic | Full |
Privacy and Data Handling
Cloud Version
- Code context sent to Sourcegraph cloud
- LLM requests may go to third-party providers
- SOC 2 Type II certified infrastructure
- No training on customer code
Self-Hosted Enterprise
- All code stays in your network
- Can use your own LLM (BYOM)
- No external API calls required
- Full control over data
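The BYOM option is configured in the Sourcegraph site configuration. The fragment below is a hedged sketch; the key names vary by Sourcegraph version, and the endpoint and model names are placeholders, so treat this as an outline rather than the authoritative schema.

```jsonc
{
  // Hypothetical site-config fragment: route Cody's LLM traffic
  // to a model hosted inside your own network.
  "completions": {
    "provider": "openai",
    "endpoint": "https://llm.internal.example.com/v1",
    "accessToken": "REDACTED",
    "chatModel": "your-internal-model",
    "completionModel": "your-internal-model"
  }
}
```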
Cody vs Competitors
| Feature | Cody | Copilot | Cursor |
|---|---|---|---|
| Codebase awareness | Full repo indexed | Current files | Project context |
| Self-hosted | Enterprise | No | Enterprise |
| LLM flexibility | Multiple + BYOM | OpenAI only | Multiple |
| Free tier | Yes | Trial only | Limited |
| Code search | Yes (Sourcegraph) | No | Basic |
Best Practices
Using Cody Safely
- Configure permissions: Control which repos are indexed
- Review generated code: as with any AI assistant, output needs verification
- Use Enterprise for sensitive code: Self-hosted keeps code in your network
- Set up access controls: Limit who can query your codebase
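Scoping what gets indexed is typically done at the code host connection level. A sketch of a GitHub connection that indexes only named repositories is shown below; the field names follow Sourcegraph's code host configuration, but verify them against your version's docs.

```jsonc
{
  // Hypothetical code host connection: index only the listed repos,
  // keeping sensitive repositories out of Cody's context entirely.
  "url": "https://github.com",
  "token": "REDACTED",
  "repos": [
    "your-org/service-api",
    "your-org/web-frontend"
  ]
}
```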
Does Cody store my code?
Cody indexes your code through Sourcegraph for search and context. In cloud deployments, this index is on Sourcegraph's servers. Enterprise self-hosted deployments keep everything in your infrastructure. Neither version uses your code for training.
Can I use Cody without Sourcegraph?
The VS Code extension can work standalone with basic functionality. However, Cody's unique value comes from codebase-wide context through Sourcegraph. Without it, Cody works similarly to other AI coding tools.
What LLMs does Cody use?
Cody can use multiple LLMs including Claude, GPT-4, and others depending on your plan. Enterprise customers can bring their own models (BYOM) for complete control over AI processing.
Is Cody better for large codebases?
Yes, Cody excels with large codebases because of its Sourcegraph integration. It can find and use relevant code from anywhere in your repository, not just nearby files. This is particularly valuable for monorepos and complex projects.