WEEKLY GITHUB PICKS
Updated every week with the hottest GitHub repos

- Ollama: Run LLMs locally with one command
- OpenAI's speech-to-text that actually works
- Node-based Stable Diffusion with full control
- Self-hosted workflow automation
- Build AI apps visually, no code needed
- Framework for building LLM applications
- Modern Python API framework, fast and typed
- Drop-in OpenAI API replacement, runs locally
- Chat with your documents, 100% private
- Let AI control your computer
- AI voice cloning and conversion
- Next-gen face swapping and enhancement
- The original Stable Diffusion web interface
- High-throughput LLM serving engine
- Autonomous AI agent that completes tasks
- Run LLMs on CPU with C++
- Build data apps in pure Python
- Build ML demos in minutes
- Browser automation that actually works
- Hand-drawn style diagrams
- S3-compatible object storage
- Open source Firebase alternative
- Backend server for web and mobile
- Self-hosted uptime monitoring
- Self-hosted Google Photos alternative
READ MORE →"Let there be light."
— Genesis 1:3