Compare commits: dd0ccb83b0 ... 064b46b47c
2 commits: 91970d395f, 064b46b47c
title: "How to get started with the Pi coding agent (on a VPS)",
author: "Willem",
tags: ~w(agentic-engineering getting-started how-to),
description: "Setting up Pi on a VPS is easier than I thought. Open source with a great unboxing experience. Follow along and enjoy!",
published: true
}
---

One thing you can do in Pi without a model is use `!` to run a shell command.

```sh
!cat [..]/lib/node_modules/@mariozechner/pi-coding-agent/docs/providers.md
```

Now we can edit `$HOME/.pi/agent/models.json` to set our provider endpoint and API key. You can specify models there as well, but that is not necessary to get started: `/model` in the UI lets you search the provided models.
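As a minimal sketch of such a file, assuming a local Ollama server speaking the OpenAI Completions API (the provider name, `baseUrl`, and `api` field names here are my assumptions, not verified against Pi's schema; its models.md has the authoritative format):

```json
{
  "providers": {
    "ollama": {
      "baseUrl": "http://localhost:11434/v1",
      "api": "openai-completions",
      "apiKey": "OLLAMA_API_KEY"
    }
  }
}
```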

I have already set up Pi on another machine, so I asked it. The next section was co-written with Qwen3.6:27b:

Tell Pi where your LLM is hosted (by Qwen)
----

Pi has built-in OpenRouter support. You just need to configure it in `~/.pi/agent/models.json`.

1. Open Pi and run `/model` to see the available OpenRouter models
2. Pick one with `/model openrouter/anthropic/claude-sonnet-4` (or whatever model you want)

Per-Model Routing (Optional)

You can control which upstream provider OpenRouter routes to using `modelOverrides`:

```json
{
  "providers": {
    "openrouter": {
      "apiKey": "OPENROUTER_API_KEY",
      "modelOverrides": {
        "anthropic/claude-sonnet-4": {
          "compat": {
            "openRouterRouting": {
              "only": ["anthropic"]
            }
          }
        }
      }
    }
  }
}
```

See the full `compat.openRouterRouting` options in the models docs for `order`, `ignore`, `max_price`, `preferred_min_throughput`, etc.
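As an illustration only (these knobs mirror OpenRouter's provider-routing options, and I have not verified the exact spelling Pi expects; the models docs are authoritative), a fuller routing block might look like:

```json
"openRouterRouting": {
  "order": ["anthropic", "google-vertex"],
  "ignore": ["deepinfra"],
  "max_price": { "prompt": 3, "completion": 15 }
}
```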

First prompt
----

Finally, we can issue a prompt. I learned (from Claude Code) that asking a model to write a haiku is a way to get started when you just want to see if it works. Sometimes I put my frustrations in the prompt. Not this time.

> Model: qwen3.5:35b-a3b

Me: write a haiku

(reasoning tokens:)

> The user is asking me to write a haiku. A haiku is a form of Japanese poetry with three lines following a 5-7-5 syllable pattern. Let me write one about nature, which is the traditional subject.

> Soft rain falls gently,
> Droplets kiss the green leaves now,
> Spring's breath fills the air.

I hope this works for you. Enjoy!

Afterword
----