🖥️ IDE Integration

Use LLM Proxy with Cursor IDE

Configure Cursor to use your LLM proxy for AI-powered coding. Gain access to multiple providers, unified billing, and centralized monitoring while keeping Cursor's native AI experience.

cursor-settings.json
// Cursor Settings - LLM Proxy
{
    "cursor.ai.openaiApiKey": "proxy-key-here",
    "cursor.ai.openaiBaseUrl": "https://proxy.example.com/v1",
    "cursor.ai.model": "gpt-4"
}

⏱️ Setup Time

2 minutes to configure

✅ Compatibility

All Cursor AI features work

🔄 Providers

Switch models without restart

📊 Monitoring

Track usage via proxy dashboard

Configuration Steps

Set up Cursor IDE to use your LLM proxy

1
Open Cursor Settings
Required

Navigate to Cursor settings to configure the AI provider connection. Use the keyboard shortcut or menu option.

Keyboard Shortcut
# Windows/Linux
Ctrl + ,

# macOS
Cmd + ,
2
Configure API Endpoint
Required

Set your proxy endpoint as the OpenAI base URL. Cursor will route all AI requests through your proxy.

Settings Configuration
{
    "cursor.ai.openaiBaseUrl": "https://your-proxy.com/v1",
    "cursor.ai.openaiApiKey": "your-proxy-api-key"
}
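A quick sanity check can catch the most common mistakes before pasting these values into Cursor. The sketch below (placeholder URL and key, not real credentials) assumes an OpenAI-compatible proxy, where clients typically append paths like `/chat/completions` to the base URL:

```python
# Sanity-check the proxy settings before pasting them into Cursor.
# URL and key are placeholders; replace with your proxy's real values.
settings = {
    "cursor.ai.openaiBaseUrl": "https://your-proxy.com/v1",
    "cursor.ai.openaiApiKey": "your-proxy-api-key",
}

base_url = settings["cursor.ai.openaiBaseUrl"]
# OpenAI-compatible clients append paths such as /chat/completions,
# so the base URL usually needs the /v1 path and no trailing slash.
assert base_url.endswith("/v1"), "base URL should end with /v1"
assert not base_url.endswith("/v1/"), "drop the trailing slash"
assert settings["cursor.ai.openaiApiKey"], "API key must not be empty"
print("settings look valid")
```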
3
Select Model
Optional

Choose your preferred model. Use proxy-prefixed names to route to specific providers.

🟢
gpt-4
OpenAI GPT-4
🟣
claude-3-5-sonnet
Anthropic Claude
🔵
gemini-pro
Google Gemini
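One common prefixing convention (an assumption here; check your proxy's documentation for its exact model names) is `provider/model`, e.g. `anthropic/claude-3-5-sonnet`, with unprefixed names falling back to a default provider. A minimal sketch of how such a proxy might resolve the name:

```python
# Hypothetical provider-prefix convention: "anthropic/claude-3-5-sonnet".
# Your proxy's actual naming scheme may differ; consult its docs.
def split_model(name: str) -> tuple[str, str]:
    """Return (provider, model); unprefixed names default to 'openai'."""
    provider, sep, model = name.partition("/")
    return (provider, model) if sep else ("openai", name)

print(split_model("anthropic/claude-3-5-sonnet"))  # ('anthropic', 'claude-3-5-sonnet')
print(split_model("gpt-4"))                        # ('openai', 'gpt-4')
```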
4
Verify Connection
Recommended

Test the connection by using Cursor's AI features. All requests should appear in your proxy dashboard.

Test Result
$ cursor-ai-test
✓ Connection successful
✓ Model: gpt-4
✓ Provider: openai (via proxy)
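You can also verify the proxy outside Cursor by sending the same OpenAI-style chat request Cursor would. The sketch below builds (but does not send) such a request; the URL and key are placeholders:

```python
# Build the OpenAI-style chat request Cursor sends through the proxy.
# Placeholder URL/key; the request is constructed but not sent here.
import json
import urllib.request

payload = {
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "ping"}],
    "max_tokens": 5,
}
req = urllib.request.Request(
    "https://your-proxy.com/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer your-proxy-api-key",
        "Content-Type": "application/json",
    },
)
# To actually send it: urllib.request.urlopen(req)
print(req.get_method(), req.full_url)  # POST https://your-proxy.com/v1/chat/completions
```

A successful response (and a matching entry in the proxy dashboard) confirms end-to-end routing.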

Supported Features

All Cursor AI features work through your proxy

⚡

Code Generation

Generate code from comments and descriptions with any model.

💬

Chat Interface

Use Cursor's chat with Claude, GPT-4, or any provider.

🔄

Code Refactoring

Refactor and improve code using your preferred model.

🔍

Code Explanation

Get explanations for complex code sections.

🐛

Bug Detection

Identify and fix bugs with AI assistance.

📝

Documentation

Generate documentation and comments automatically.

Proxy vs Direct

Benefits of using proxy with Cursor

| Feature | Direct Provider | With Proxy |
| --- | --- | --- |
| Multiple Providers | Separate configs | Single config ✓ |
| Cost Tracking | Per-provider dashboards | Unified dashboard ✓ |
| Model Switching | Restart required | Instant switch ✓ |
| Rate Limiting | Per-provider limits | Smart pooling ✓ |
| Failover | Manual switching | Automatic ✓ |
| API Keys | Multiple to manage | One key ✓ |

Start Using Cursor with Proxy

Configure Cursor IDE to use your LLM proxy for unified access to all AI providers, with better monitoring, cost control, and flexibility.