A self-hosted, offline, ChatGPT-like chatbot. Powered by Llama 2. 100% private, with no data leaving your device. New: Code Llama support!
- Updated Dec 22, 2023
- TypeScript
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.
AGiXT is a dynamic AI Agent Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.
Jan is an open-source alternative to ChatGPT that runs 100% offline on your computer.
Chat with your favourite LLaMA models in a native macOS app
Run local LLaMA/GPT models easily and quickly in C#! 🤗 LLamaSharp is also easy to integrate with semantic-kernel, Unity, WPF, and web apps.
Believe in AI democratization: llama for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp, runs locally on your laptop CPU. Supports llama/alpaca/gpt4all/vicuna/rwkv models.
A FastAPI service for semantic text search using precomputed embeddings and advanced similarity measures, with built-in support for various file types through textract.
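Services like this typically rank documents by cosine similarity between a query embedding and precomputed document embeddings. A minimal sketch of that core ranking step in plain Python (the function names and toy vectors are illustrative assumptions, not this project's actual API):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec: list[float], doc_vecs: list[list[float]], top_k: int = 3):
    """Return (doc_index, score) pairs sorted by similarity, best first."""
    scored = [(i, cosine_similarity(query_vec, v)) for i, v in enumerate(doc_vecs)]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:top_k]

# Toy precomputed embeddings; real ones come from an embedding model.
docs = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
print(search([1.0, 0.1], docs, top_k=2))
```

In a real service the document vectors are computed once at index time, so each query costs only one embedding call plus these cheap dot products.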
The TypeScript library for building AI applications.
Calculate token/s & GPU memory requirement for any LLM. Supports llama.cpp/ggml/bnb/QLoRA quantization
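A rough rule of thumb behind such calculators: weight memory is parameter count × bytes per parameter, with extra headroom for the KV cache and activations. A hedged back-of-the-envelope sketch (the bytes-per-parameter table is a simplified assumption, not this tool's actual method):

```python
# Approximate bytes per weight for common formats (simplified; real
# quantization schemes add small per-block scale overheads).
BYTES_PER_PARAM = {
    "fp16": 2.0,
    "int8": 1.0,
    "q4": 0.5,  # 4-bit quantization
}

def estimate_weight_gb(n_params_billions: float, fmt: str) -> float:
    """Estimate GPU memory (GiB) needed just to hold the weights."""
    n_bytes = n_params_billions * 1e9 * BYTES_PER_PARAM[fmt]
    return n_bytes / 2**30

# A 7B model in fp16 needs roughly 13 GiB for weights alone;
# 4-bit quantization cuts that to roughly a quarter.
print(round(estimate_weight_gb(7, "fp16"), 1))
print(round(estimate_weight_gb(7, "q4"), 1))
```

Actual requirements also depend on context length (KV cache grows with it), so treat this as a lower bound.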
A toolkit for controllable, private AI on consumer hardware, in Rust
A simple "Be My Eyes" web app with a llama.cpp/llava backend
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
llama.cpp with the BakLLaVA model describes what it sees
The "vicuna-installation-guide" provides step-by-step instructions for installing and configuring the Vicuna 13B and 7B models
♾️ toolkit for air-gapped LLMs on consumer-grade hardware
AI-powered cybersecurity chatbot designed to provide helpful and accurate answers to your cybersecurity-related queries, as well as perform code analysis and scan analysis.
A fast, lightweight, embeddable inference engine to supercharge your apps with local AI. OpenAI-compatible API
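An OpenAI-compatible API means existing clients can POST to the standard `/v1/chat/completions` route on a local server instead of api.openai.com. A minimal sketch of such a request payload (the localhost port and model name are assumptions; the actual values depend on how the engine is launched):

```python
import json

# Endpoint exposed by an OpenAI-compatible local server (port is an assumption).
url = "http://localhost:8080/v1/chat/completions"

payload = {
    "model": "local-model",  # placeholder; local engines often map or ignore this
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in one word."},
    ],
    "temperature": 0.7,
}

body = json.dumps(payload)
print(body)
# To send: POST `body` to `url` with Content-Type: application/json,
# using urllib.request or any HTTP client.
```

Because the wire format matches OpenAI's, official and third-party SDKs usually work by just pointing their base URL at the local server.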
A CLI and web UI to interact with LLMs in a Chat-style interface, with code execution capabilities.
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.