# llama

Here are 28 public repositories matching this topic...

Complete AI automation stack optimized for the Mac mini M4 but adaptable to other machine configurations. Features n8n workflows, Ollama with Llama 3.2, Open WebUI, a LiteLLM proxy, and MCP integration. Production-ready, with automated backups, security, and a family-safe configuration for learning about AI at home.

  • Updated Oct 31, 2025
  • Shell
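
For context, the sketch below shows one way a stack like the one described above might be bootstrapped: pulling Llama 3.2 through Ollama and starting Open WebUI against the local Ollama API. This is a minimal illustration assuming Ollama and Docker are already installed, not the repository's actual setup script; the container name, port mapping, and volume name are placeholders.

```sh
#!/usr/bin/env bash
set -euo pipefail

# Pull the Llama 3.2 model through Ollama (model tag from the Ollama library).
ollama pull llama3.2

# Run Open WebUI in Docker, pointed at the local Ollama API.
# host.docker.internal resolves to the host from inside Docker Desktop on macOS.
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

In a full stack of this kind, n8n and the LiteLLM proxy would typically run as additional services alongside Open WebUI rather than being started by hand.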

A shell script that automatically updates or builds llama.cpp with optimal GPU support. Cross-platform, with smart architecture detection, safe updates, and easy installation. Choose fast binary downloads or custom source builds for maximum performance.

  • Updated Aug 1, 2025
  • Shell
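
As an illustration of the update-or-build flow described above, here is a minimal sketch: clone or fast-forward llama.cpp, enable the CUDA backend when an NVIDIA GPU is visible (Metal is enabled by default in the macOS CMake build), and compile in Release mode. The repository path and the GPU detection logic are assumptions for this sketch; the actual script in the repository is likely more thorough.

```sh
#!/usr/bin/env bash
set -euo pipefail

REPO_DIR="$HOME/llama.cpp"

# Clone on first run, otherwise fast-forward to the latest upstream commit.
if [ -d "$REPO_DIR/.git" ]; then
  git -C "$REPO_DIR" pull --ff-only
else
  git clone https://github.com/ggerganov/llama.cpp "$REPO_DIR"
fi

# Pick a GPU backend: CUDA when an NVIDIA GPU is visible; on macOS the Metal
# backend is enabled by default, so no extra flag is needed there.
BACKEND_FLAGS=""
if command -v nvidia-smi >/dev/null 2>&1; then
  BACKEND_FLAGS="-DGGML_CUDA=ON"
fi

cmake -S "$REPO_DIR" -B "$REPO_DIR/build" -DCMAKE_BUILD_TYPE=Release $BACKEND_FLAGS
cmake --build "$REPO_DIR/build" --config Release -j
```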
