Configuration Overview

GitHub Copilot with LLM Proxy

Configure Visual Studio and VS Code to use your LLM proxy for GitHub Copilot. Access multiple AI providers through Copilot's interface with unified management and monitoring capabilities.

Setup Time: 3 min (quick configuration)
Compatibility: 100% (all features work)
Providers: 10+ available models
⚙️ Configuration Settings

📋 Visual Studio Code Settings

Configure VS Code settings to route Copilot requests through your proxy.

settings.json
{
    // Route Copilot requests through the proxy (VS Code settings.json allows comments)
    "github.copilot.advanced": {
        "debug.overrideProxyUrl": "https://your-proxy.com/v1",
        "debug.testOverrideProxyUrl": "https://your-proxy.com/v1"
    }
}
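
Before restarting the IDE, you can confirm that the override URL actually answers. A minimal check, assuming your proxy exposes an OpenAI-compatible /v1/models route (the URL and key are the placeholders used throughout this guide):

# Sanity-check the proxy endpoint before restarting the IDE.
# Assumes an OpenAI-compatible /models route; adjust for your proxy.
curl -i https://your-proxy.com/v1/models \
  -H "Authorization: Bearer your-proxy-key"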

🔧 Environment Variables

Set environment variables for proxy authentication and endpoint configuration.

Environment Setup
# Windows
set OPENAI_API_BASE=https://your-proxy.com/v1
set OPENAI_API_KEY=your-proxy-key

# macOS/Linux
export OPENAI_API_BASE=https://your-proxy.com/v1
export OPENAI_API_KEY=your-proxy-key
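
Note that set and export only affect the current shell session. To persist the variables across sessions, one approach is a sketch like the following (adjust the profile file to your shell):

# Windows: persist for future sessions (does not change the current one)
setx OPENAI_API_BASE "https://your-proxy.com/v1"
setx OPENAI_API_KEY "your-proxy-key"

# macOS/Linux: append to your shell profile (e.g. ~/.zshrc or ~/.bashrc)
echo 'export OPENAI_API_BASE=https://your-proxy.com/v1' >> ~/.zshrc
echo 'export OPENAI_API_KEY=your-proxy-key' >> ~/.zshrc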

Setup Steps

Configure Copilot to use your LLM proxy

1

Install Copilot Extension

Install the GitHub Copilot extension: in VS Code, from the Extensions Marketplace; in Visual Studio, through the Visual Studio Installer. In VS Code you can also install it from the command line, as shown below.
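
# Install the Copilot and Copilot Chat extensions via the VS Code CLI
code --install-extension GitHub.copilot
code --install-extension GitHub.copilot-chat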

2

Open Settings

Navigate to settings (Ctrl+, on Windows/Linux, Cmd+, on macOS) and search for Copilot configuration options.

3

Configure Proxy

Set the proxy endpoint URL in Copilot's advanced settings (see the settings.json example above) and supply the API key via the environment variables described earlier.

4

Restart IDE

Restart Visual Studio or VS Code so the proxy configuration and environment variables take effect; you can then verify the setup end to end, as sketched below.
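
A sketch of an end-to-end check, assuming an OpenAI-compatible chat completions route (the model name is an example; use one your proxy actually exposes):

# Minimal end-to-end check: a chat completion routed through the proxy
curl https://your-proxy.com/v1/chat/completions \
  -H "Authorization: Bearer your-proxy-key" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]}'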

Supported Features

Full Copilot functionality through your proxy

Feature                  Status     Notes
Code Completion          Supported  Real-time suggestions
Multi-line Completions   Supported  Full function generation
Copilot Chat             Supported  Conversational assistance
Code Explanation         Supported  Explain selected code
Test Generation          Supported  Generate unit tests
Doc Generation           Supported  Documentation comments

Available Providers

Access multiple AI models through your proxy

🟢 OpenAI GPT-4
🟣 Anthropic Claude
🔵 Google Gemini
🟠 Meta Llama
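
To see exactly which model IDs your proxy exposes (names vary with the proxy's configuration), you can list them. This assumes an OpenAI-compatible /v1/models route and that jq is installed:

# List the model IDs available through the proxy
curl -s https://your-proxy.com/v1/models \
  -H "Authorization: Bearer your-proxy-key" | jq -r '.data[].id'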

Configure Copilot with Proxy

Set up GitHub Copilot to use your LLM proxy for unified access to multiple AI providers, with better cost control, monitoring, and flexibility.