
Commit 221987d

Docs for v0.1.17 (#75)

* Add muxing docs
* Add OpenRouter endpoint docs
* Add Kodu docs

1 parent 1a93f17 commit 221987d

31 files changed: +805 -221 lines

docs/about/changelog.md

+17 -2

@@ -13,12 +13,27 @@ Major features and changes are noted here. To review all updates, see the

 Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)

-- **New integration: Open Interpreter** - xx Feb\
-  2025 CodeGate v0.1.16 introduces support for
+- **Model muxing** - 7 Feb, 2025\
+  With CodeGate v0.1.17 you can use the new `/v1/mux` endpoint to configure
+  model selection based on your workspace! Learn more in the
+  [model muxing guide](../features/muxing.md).
+
+- **OpenRouter endpoint** - 7 Feb, 2025\
+  CodeGate v0.1.17 adds a dedicated `/openrouter` provider endpoint for
+  OpenRouter users. This endpoint currently works with Continue, Cline, and Kodu
+  (Claude Coder).
+
+- **New integration: Open Interpreter** - 4 Feb, 2025\
+  CodeGate v0.1.16 added support for
   [Open Interpreter](https://github.com/openinterpreter/open-interpreter) with
   OpenAI-compatible APIs. Review the
   [integration guide](../integrations/open-interpreter.mdx) to get started.

+- **New integration: Claude Coder** - 28 Jan, 2025\
+  CodeGate v0.1.14 also introduced support for Kodu's
+  [Claude Coder](https://www.kodu.ai/extension) extension. See the
+  [integration guide](../integrations/kodu.mdx) to learn more.
+
 - **New integration: Cline** - 28 Jan, 2025\
   CodeGate version 0.1.14 adds support for [Cline](https://cline.bot/) with
   Anthropic, OpenAI, Ollama, and LM Studio. See the

docs/features/muxing.md

+129 (new file)

---
title: Model muxing
description: Configure a per-workspace LLM
sidebar_position: 35
---

## Overview

_Model muxing_ (or multiplexing) allows you to configure your AI assistant once
and use [CodeGate workspaces](./workspaces.mdx) to switch between LLM providers
and models without reconfiguring your development environment. This feature is
especially useful when you're working on multiple projects or tasks that require
different AI models.

For each CodeGate workspace, you can select the AI provider and model
combination you want to use. Then, configure your AI coding tool to use the
CodeGate muxing endpoint `http://localhost:8989/v1/mux` as an OpenAI-compatible
API provider.

To change the model currently in use, simply switch your active CodeGate
workspace.

```mermaid
flowchart LR
    Client(AI Assistant/Agent)
    CodeGate{CodeGate}
    WS1[Workspace-A]
    WS2[Workspace-B]
    WS3[Workspace-C]
    LLM1(OpenAI/<br>o3-mini)
    LLM2(Ollama/<br>deepseek-r1)
    LLM3(OpenRouter/<br>claude-35-sonnet)

    Client ---|/v1/mux| CodeGate
    CodeGate --> WS1
    CodeGate --> WS2
    CodeGate --> WS3
    WS1 --> |api| LLM1
    WS2 --> |api| LLM2
    WS3 --> |api| LLM3
```

## Use cases

- You have a project that requires a specific model for a particular task, but
  you also need to switch between different models during the course of your
  work.
- You want to experiment with different LLM providers and models without having
  to reconfigure your AI assistant/agent every time you switch.
- Your AI coding assistant doesn't support a particular provider or model that
  you want to use. CodeGate's muxing provides an OpenAI-compatible abstraction
  layer.
- You're working on a sensitive project and want to use a local model, but still
  have the flexibility to switch to hosted models for other work.
- You want to control your LLM provider spend by using lower-cost models for
  tasks that don't require the power of more advanced (and expensive) reasoning
  models.

## Configure muxing

To use muxing with your AI coding assistant, add one or more AI providers to
CodeGate, then select the model you want to use on a workspace.

CodeGate supports the following LLM providers for muxing:

- Anthropic
- llama.cpp
- LM Studio
- Ollama
- OpenAI (and compatible APIs)
- OpenRouter
- vLLM

### Add a provider

1. In the [CodeGate dashboard](http://localhost:9090), open the **Providers**
   page from the **Settings** menu.
1. Click **Add Provider**.
1. Enter a display name for the provider, then select the type from the
   drop-down list. The default endpoint and authentication type are filled in
   automatically.
1. If you are using a non-default endpoint, update the **Endpoint** value.
1. Optionally, add a **Description** for the provider.
1. If the provider requires authentication, select the **API Key**
   authentication option and enter your key.

When you save the settings, CodeGate connects to the provider to retrieve the
available models.

:::note

For locally hosted models, you must use `http://host.docker.internal` instead
of `http://localhost`.

:::

### Select the model for a workspace

Open the settings of one of your [workspaces](./workspaces.mdx) from the
Workspace selection menu or the
[Manage Workspaces](http://localhost:9090/workspaces) screen.

In the **Preferred Model** section, select the model to use with the workspace.

### Manage existing providers

To edit a provider's settings, click the **Manage** button next to the provider
in the list. For providers that require authentication, you can leave the API
key field blank to preserve the current value.

To delete a provider, click the trash icon next to it. If the provider was in
use by any workspaces, you will need to update their settings to choose a
different provider/model.

### Refresh available models

To refresh the list of models available from a provider, click the **Manage**
button next to the provider in the Providers list, then save it without making
any changes.

## Configure your client

Configure the OpenAI-compatible API base URL of your AI coding assistant/agent
to `http://localhost:8989/v1/mux`. If your client requires a model name and/or
API key, you can enter any values, since CodeGate manages the model selection
and authentication.

For specific instructions, see the
[integration guide](../integrations/index.mdx) for your client.
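In practice, the client configuration above means pointing any OpenAI-compatible client at the mux base URL. The following sketch illustrates the request shape; the `/chat/completions` path suffix and the placeholder model/key values are assumptions based on the standard OpenAI-compatible API, not details documented in this commit:

```python
import json
import urllib.request

# Hypothetical sketch: any OpenAI-compatible client can target CodeGate's
# muxing endpoint. The active workspace's preferred model handles the request,
# so the model name and API key below are placeholders CodeGate ignores.
BASE_URL = "http://localhost:8989/v1/mux"

payload = {
    "model": "placeholder",  # ignored; the workspace selects the real model
    "messages": [{"role": "user", "content": "Explain this function."}],
}
req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",  # assumed OpenAI-compatible path
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer placeholder",  # any value works
    },
)
# Uncomment to send the request through a running CodeGate instance:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because CodeGate owns provider selection and authentication, switching workspaces changes which model answers without touching this client-side configuration.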

docs/features/workspaces.mdx

+9 -4

@@ -25,9 +25,13 @@ Workspaces offer several key features:

 - **Custom instructions**: Customize your interactions with LLMs by augmenting
   your AI assistant's system prompt, enabling tailored responses and behaviors
-  for different types of tasks. CodeGate includes a library of community prompts
-  that can be easily customized for specific tasks. You can also create your
-  own.
+  for different types of tasks. Choose from CodeGate's library of community
+  prompts or create your own.
+
+- [**Model muxing**](./muxing.md): Configure the LLM provider/model for each
+  workspace, allowing you to configure your AI assistant/agent once and switch
+  between different models on the fly. This is useful when working on multiple
+  projects or tasks that require different AI models.

 - **Prompt and alert history**: Your LLM interactions (prompt history) and
   CodeGate security detections (alert history) are recorded in the active

@@ -112,7 +116,8 @@ In the workspace list, open the menu (**...**) next to a workspace to

 **Activate**, **Edit**, or **Archive** the workspace.

 **Edit** opens the workspace settings page. From here you can rename the
-workspace, set the custom prompt instructions, or archive the workspace.
+workspace, select the LLM provider and model (see [Model muxing](./muxing.md)),
+set the custom prompt instructions, or archive the workspace.

 **Archived** workspaces can be restored or permanently deleted from the
 workspace list or workspace settings screen.

docs/how-to/configure.md

+9 -17

@@ -1,14 +1,15 @@

 ---
-title: Configure CodeGate
+title: Advanced configuration
 description: Customizing CodeGate's application settings
-sidebar_position: 20
+sidebar_position: 30
 ---

 ## Customize CodeGate's behavior

-The CodeGate container runs with default settings to support Ollama, Anthropic,
-and OpenAI APIs with typical settings. To customize the behavior, you can add
-extra configuration parameters to the container as environment variables:
+The CodeGate container runs with defaults that work with supported LLM providers
+using typical settings. To customize CodeGate's application settings like
+provider endpoints and logging level, you can add extra configuration parameters
+to the container as environment variables:

 ```bash {2}
 docker run --name codegate -d -p 8989:8989 -p 9090:9090 \

@@ -31,22 +32,13 @@ CodeGate supports the following parameters:

 | `CODEGATE_OPENAI_URL` | `https://api.openai.com/v1` | Specifies the OpenAI engine API endpoint URL. |
 | `CODEGATE_VLLM_URL` | `http://localhost:8000` | Specifies the URL of the vLLM server to use. |

-## Example: Use CodeGate with OpenRouter
+## Example: Use CodeGate with a remote Ollama server

-[OpenRouter](https://openrouter.ai/) is an interface to many large language
-models. CodeGate's vLLM provider works with OpenRouter's API when used with the
-Continue IDE plugin.
-
-To use OpenRouter, set the vLLM URL when you launch CodeGate:
+Set the Ollama server's URL when you launch CodeGate:

 ```bash {2}
 docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
-  -e CODEGATE_VLLM_URL=https://openrouter.ai/api \
+  -e CODEGATE_OLLAMA_URL=https://my.ollama-server.example \
   --mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
   --restart unless-stopped ghcr.io/stacklok/codegate
 ```
-
-Then,
-[configure the Continue IDE plugin](../integrations/continue.mdx?provider=vllm)
-to use CodeGate's vLLM endpoint (`http://localhost:8989/vllm`) along with the
-model you'd like to use and your OpenRouter API key.
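The remote Ollama example above can be sanity-checked before launching CodeGate. This is a hedged sketch: the hostname is the placeholder from the example, and `/api/tags` is Ollama's standard model-listing endpoint (an assumption drawn from Ollama's API, not from this commit):

```shell
# Quick reachability check for a remote Ollama server before wiring it into
# CodeGate via CODEGATE_OLLAMA_URL. Replace the placeholder hostname with
# your actual server; /api/tags lists the models the server has available.
OLLAMA_URL="https://my.ollama-server.example"
curl -sf "${OLLAMA_URL}/api/tags" >/dev/null \
  && echo "reachable: ${OLLAMA_URL}" \
  || echo "NOT reachable: ${OLLAMA_URL}"
```

If the check fails, fix network access or the URL first; CodeGate will not be able to retrieve models from an unreachable server.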

docs/how-to/dashboard.md

+1 -1

@@ -1,7 +1,7 @@

 ---
 title: Access the dashboard
 description: View alerts and usage history
-sidebar_position: 30
+sidebar_position: 20
 ---

 ## Enable dashboard access

docs/index.md

+20 -3

@@ -31,6 +31,20 @@ sequenceDiagram

     deactivate CodeGate
 ```

+## Key features
+
+CodeGate includes several key features for privacy, security, and coding
+efficiency:
+
+- [Secrets encryption](./features/secrets-encryption.md) to protect your
+  sensitive credentials
+- [Dependency risk awareness](./features/dependency-risk.md) to update the
+  LLM's knowledge of malicious or deprecated open source packages
+- [Model muxing](./features/muxing.md) to quickly select the best LLM
+  provider/model for your current task
+- [Workspaces](./features/workspaces.mdx) to organize and customize your LLM
+  interactions
+
 ## Supported environments

 CodeGate supports several development environments and AI providers.

@@ -41,20 +55,23 @@ AI coding assistants / IDEs:

 - **[Cline](./integrations/cline.mdx)** in Visual Studio Code

-  CodeGate supports Ollama, Anthropic, OpenAI-compatible APIs, and LM Studio
-  with Cline
+  CodeGate supports Ollama, Anthropic, OpenAI and compatible APIs, OpenRouter,
+  and LM Studio with Cline

 - **[Continue](./integrations/continue.mdx)** with Visual Studio Code and
   JetBrains IDEs

   CodeGate supports the following AI model providers with Continue:

   - Local / self-managed: Ollama, llama.cpp, vLLM
-  - Hosted: Anthropic, OpenAI and OpenAI-compatible APIs like OpenRouter
+  - Hosted: Anthropic, OpenAI and compatible APIs, and OpenRouter

 - **[GitHub Copilot](./integrations/copilot.mdx)** with Visual Studio Code
   (JetBrains coming soon!)

+- **[Kodu / Claude Coder](./integrations/kodu.mdx)** in Visual Studio Code with
+  OpenAI-compatible APIs
+
 - **[Open Interpreter](./integrations/open-interpreter.mdx)** with
   OpenAI-compatible APIs
docs/integrations/aider.mdx

+3

@@ -17,6 +17,9 @@ CodeGate works with the following AI model providers through aider:

 - Hosted:
   - [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs

+You can also configure [CodeGate muxing](../features/muxing.md) to select your
+provider and model using [workspaces](../features/workspaces.mdx).
+
 :::note

 This guide assumes you have already installed aider using their
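Wiring aider to the muxing endpoint would follow aider's usual OpenAI-compatible pattern. A hedged sketch: the environment variable names are aider's standard OpenAI settings, and the exact values for this setup are assumptions based on the muxing guide, not this commit; see the integration guide for the authoritative steps:

```shell
# Hypothetical sketch: point aider at CodeGate's muxing endpoint via its
# standard OpenAI-compatible environment variables. The key value is a
# placeholder; CodeGate handles real provider authentication.
export OPENAI_API_BASE="http://localhost:8989/v1/mux"
export OPENAI_API_KEY="placeholder"
# Then run aider against the endpoint, e.g.:
# aider --model openai/<model-name>
```

With this in place, switching CodeGate workspaces changes the model aider uses without editing these variables.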

docs/integrations/cline.mdx

+37 -4

@@ -18,7 +18,11 @@ CodeGate works with the following AI model providers through Cline:

   - [LM Studio](https://lmstudio.ai/)
 - Hosted:
   - [Anthropic](https://www.anthropic.com/api)
-  - [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs
+  - [OpenAI](https://openai.com/api/) and compatible APIs
+  - [OpenRouter](https://openrouter.ai/)
+
+You can also configure [CodeGate muxing](../features/muxing.md) to select your
+provider and model using [workspaces](../features/workspaces.mdx).

 ## Install the Cline extension

@@ -42,10 +46,36 @@ in the VS Code documentation.

 import ClineProviders from '../partials/_cline-providers.mdx';

+:::note
+
+Cline has two modes: Plan and Act. Each mode can be configured with a different
+provider and model, so you need to configure both.
+
+:::
+
 To configure Cline to send requests through CodeGate:

-1. Open the Cline extension sidebar from the VS Code Activity Bar and open its
-   settings using the gear icon.
+1. Open the Cline extension sidebar from the VS Code Activity Bar. Note your
+   current mode, Plan or Act.
+
+   <ThemedImage
+     alt='Cline mode - plan'
+     sources={{
+       light: useBaseUrl('/img/integrations/cline-mode-plan-light.webp'),
+       dark: useBaseUrl('/img/integrations/cline-mode-plan-dark.webp'),
+     }}
+     width={'400px'}
+   />
+   <ThemedImage
+     alt='Cline mode - act'
+     sources={{
+       light: useBaseUrl('/img/integrations/cline-mode-act-light.webp'),
+       dark: useBaseUrl('/img/integrations/cline-mode-act-dark.webp'),
+     }}
+     width={'400px'}
+   />
+
+1. Open the Cline settings using the gear icon.

 <ThemedImage
   alt='Cline extension settings'

@@ -60,7 +90,10 @@ To configure Cline to send requests through CodeGate:

 <ClineProviders />

-1. Click **Done** to save the settings.
+1. Click **Done** to save the settings for your current mode.
+
+1. Switch your Cline mode from Act to Plan or vice versa, open the settings, and
+   repeat the configuration for your desired provider and model.

 ## Verify configuration