xxm/opro_demo
26f8e0c6484f268f3482fa97d9b79627356b8fb6
opro_demo/requirements.txt

8 lines · 101 B · Plaintext

feat: add Docker support for offline deployment with qwen3:14b

Major additions:
- All-in-one Docker image with Ollama + models bundled
- Separate deployment option for existing Ollama installations
- Changed default model from qwen3:8b to qwen3:14b
- Comprehensive deployment documentation

Files added:
- Dockerfile: basic app-only image
- Dockerfile.allinone: complete image with Ollama + models
- docker-compose.yml: easy deployment configuration
- docker-entrypoint.sh: startup script for the all-in-one image
- requirements.txt: Python dependencies
- .dockerignore: exclude unnecessary files from the image

Scripts:
- export-ollama-models.sh: export models from a local Ollama installation
- build-allinone.sh: build the complete offline-deployable image
- build-and-export.sh: build and export the basic image

Documentation:
- DEPLOYMENT.md: comprehensive deployment guide
- QUICK_START.md: quick reference for common tasks

Configuration:
- Updated config.py: DEFAULT_CHAT_MODEL = qwen3:14b
- Updated frontend/opro.html: page title to 系统提示词优化 ("System Prompt Optimization")
2025-12-08 10:10:38 +08:00
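The commit above references a docker-compose.yml for easy deployment of the all-in-one image. A minimal sketch of what such a file might contain, assuming the FastAPI app listens on port 8000 and Ollama on its default port 11434 (the service name, ports, and volume path here are illustrative assumptions, not taken from the repository):

```yaml
# Hypothetical docker-compose.yml sketch for the all-in-one image.
# Service name, ports, and volume paths are assumptions for illustration.
services:
  opro:
    build:
      context: .
      dockerfile: Dockerfile.allinone   # bundles Ollama + models, per the commit
    ports:
      - "8000:8000"      # assumed FastAPI/uvicorn port
      - "11434:11434"    # Ollama's default API port
    volumes:
      - ollama-data:/root/.ollama   # persist bundled/downloaded models

volumes:
  ollama-data:
```

With such a file in place, `docker compose up -d` would build and start the container in one step, which matches the "easy deployment configuration" role the commit assigns to docker-compose.yml.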
fastapi==0.109.0
uvicorn==0.27.0
requests==2.31.0
numpy==1.26.3
scikit-learn==1.4.0
pydantic==2.5.3
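Every dependency above is pinned with an exact `name==version` specifier, which is what makes the image reproducible for offline deployment. As a small illustration of that format, the sketch below parses such pinned lines into a dict; the `parse_pins` helper is hypothetical, not part of the repository:

```python
import re

# The pinned requirements from requirements.txt above.
REQUIREMENTS = """\
fastapi==0.109.0
uvicorn==0.27.0
requests==2.31.0
numpy==1.26.3
scikit-learn==1.4.0
pydantic==2.5.3
"""

def parse_pins(text):
    """Parse exact `name==version` lines into a {name: version} dict.

    Skips blank lines and comments; ignores lines that are not
    exact pins (e.g. ranges like `foo>=1.0`).
    """
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        m = re.fullmatch(r"([A-Za-z0-9_.-]+)==([A-Za-z0-9_.]+)", line)
        if m:
            pins[m.group(1)] = m.group(2)
    return pins

print(parse_pins(REQUIREMENTS)["fastapi"])  # 0.109.0
```

Installing the file itself is the usual `pip install -r requirements.txt`, typically inside a virtual environment or, as here, an image build step.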