01 README
Ollama LLM integration (generate, batch generate, unload)
02 Models
@keeb/ollama v2026.03.28.1 (ollama.ts)
fn generate(prompt: string, input: string)
Send a prompt and an input to Ollama and return structured output
| Argument | Type | Description |
|---|---|---|
| prompt | string | System prompt / instructions |
| input | string | Input to process |
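The call shape can be sketched against Ollama's standard HTTP API, assuming the module wraps `POST /api/generate` (the endpoint Ollama ships with); the model name and request builder below are illustrative, not taken from this module's source:

```typescript
// Hypothetical sketch of the request a generate(prompt, input) call would send.
// "llama3" is a placeholder model name, not something this module pins.
interface GenerateRequest {
  model: string;
  system: string;  // system prompt / instructions
  prompt: string;  // the input to process
  stream: boolean; // false => one JSON response instead of a token stream
}

function buildGenerateRequest(prompt: string, input: string): GenerateRequest {
  return { model: "llama3", system: prompt, prompt: input, stream: false };
}

// Sending it requires a running Ollama server (default port 11434):
// const res = await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildGenerateRequest("Summarize in one line.", text)),
// });
```

Mapping the system prompt and the per-item input to separate fields is what lets the same instructions be reused across many inputs, as `generate_batch` below does.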
fn unload()
Unload the model from VRAM to free GPU memory
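Ollama's documented way to evict a model is to send a request with `keep_alive: 0`; a sketch of what `unload()` plausibly does (endpoint and model name are assumptions, not read from this module):

```typescript
// Hypothetical sketch: Ollama unloads a model from VRAM when a request
// arrives with keep_alive set to 0 and no prompt to generate for.
function buildUnloadRequest(model: string): { model: string; keep_alive: number } {
  return { model, keep_alive: 0 }; // keep_alive: 0 => evict immediately
}

// Against a running server:
// await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildUnloadRequest("llama3")),
// });
```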
fn generate_batch(prompt: string)
Send multiple inputs through the same prompt (factory: one resource per input)
| Argument | Type | Description |
|---|---|---|
| prompt | string | System prompt / instructions |
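The factory pattern described above (one prompt, many inputs, one result resource per input) can be sketched as follows; `runOne` is a hypothetical stand-in for a single generate call, and all names here are illustrative:

```typescript
// Hypothetical sketch of the batch factory: reuse one system prompt across
// many inputs and collect one result per input, in input order.
async function generateBatch<T>(
  prompt: string,
  inputs: string[],
  runOne: (prompt: string, input: string) => Promise<T>,
): Promise<T[]> {
  // Run sequentially so only one request hits the model at a time,
  // which keeps VRAM pressure predictable on a single-GPU host.
  const results: T[] = [];
  for (const input of inputs) {
    results.push(await runOne(prompt, input));
  }
  return results;
}
```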
Resources
result (infinite): LLM generation result
03 Skills
ollama (1 file)
04 Previous Versions
2026.04.22.2 (Apr 22, 2026)
Added 1 skill
2026.04.06.1 (Apr 7, 2026)
2026.04.02.1 (Apr 2, 2026)
Modified 1 model
2026.03.28.1 (Mar 29, 2026)
05 Stats
Score: A (100 / 100)
Downloads: 7
Archive size: 6.4 KB
- Has README or module doc: 2/2 earned
- README has a code example: 1/1 earned
- README is substantive: 1/1 earned
- Most symbols documented: 1/1 earned
- No slow types: 1/1 earned
- Has description: 1/1 earned
- At least one platform tag (or universal): 1/1 earned
- Two or more platform tags (or universal): 1/1 earned
- License declared: 1/1 earned
- Verified public repository: 2/2 earned
Repository
https://github.com/keeb/swamp-ollama
06 Security Notice
This extension includes AI agent skills that can modify AI assistant behavior. Review the skill files before installing.
07 Platforms
08 Labels