darthzoloft/runpod-worker-ollama
# Runpod serverless runner for ollama

## How to use

Start a RunPod serverless endpoint with the Docker container `svenbrnn/runpod-ollama:latest`. Set the `OLLAMA_MODEL_NAME` environment variable to a model name from ollama.com to download that model automatically. If a volume is mounted, it is used automatically for model storage.
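Once deployed, the endpoint can be called through RunPod's standard serverless REST API (`/run` for async jobs, `/runsync` for blocking ones). A minimal sketch, assuming a hypothetical endpoint ID and API key; the fields inside the `input` object are illustrative, not this worker's exact schema — see the `test_inputs` directory for real examples:

```python
import json
import urllib.request

# Hypothetical values -- substitute your own endpoint ID and RunPod API key.
ENDPOINT_ID = "your-endpoint-id"
API_KEY = "your-runpod-api-key"

# RunPod serverless jobs are submitted to /run (async) or /runsync (blocking).
url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"

# RunPod wraps every job payload in an "input" object; the fields inside it
# are illustrative only -- check test_inputs for the worker's real schema.
payload = {"input": {"model": "llama3", "prompt": "Why is the sky blue?"}}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# Uncomment to actually submit the job (requires valid credentials):
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```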


## Environment variables

| Variable Name | Description | Default Value |
| --- | --- | --- |
| `OLLAMA_MODEL_NAME` | The name of the model to download | NULL |

## Test requests for the runpod.io console

See the `test_inputs` directory for example test requests.

## Streaming

Streaming for OpenAI-compatible requests is fully supported.
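On the client side, an OpenAI-style stream arrives as server-sent events, one `data:` line per chunk, terminated by a `[DONE]` sentinel. A minimal sketch of decoding such a stream; the sample chunks below are illustrative, not output captured from this worker:

```python
import json

# Illustrative SSE lines in the shape an OpenAI-compatible endpoint emits.
sse_lines = [
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": ", world"}}]}',
    "data: [DONE]",
]

def collect_stream(lines):
    """Accumulate the delta content from OpenAI-style SSE chunks."""
    text = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        data = line[len("data: "):]
        if data == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        text.append(delta.get("content", ""))
    return "".join(text)

print(collect_stream(sse_lines))  # -> Hello, world
```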

## Preload a model into the Docker image

See the `embed_model` directory for instructions.

## Licence

This project is licensed under the Creative Commons Attribution 4.0 International License. You are free to use, share, and adapt the material for any purpose, even commercially, as long as you comply with the license's attribution terms.

For more details, see the license.
