2024-05-28 Added open-source LLM projects

llamacpp

LLM inference in Fortran

An intelligent AI assistant that can do anything!

🤖 Sam-assistant is a personal assistant designed to understand your documents, search the internet, and, in future versions, create and understand images and communicate with you. It is built in Python, mainly using Langchain, and implements most of Langchain's features.

A 3rd party testground for KoboldCPP, a simple one-file way to run various GGML/GGUF models with KoboldAI’s UI. (for KCCP Frankenstein, in CPU mode, CUDA, CLBLAST, or VULKAN)

Simplify and accelerate AI-powered application development with structured interfaces to models and powerful prompt execution environments.

A demo Python script to interact with a llama.cpp server using the Whisper API, microphone, and webcam devices.
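As a minimal sketch of how such scripts talk to a running llama.cpp server: the server exposes a `/completion` HTTP endpoint that accepts a JSON body with a `prompt` and `n_predict` field. The helper name and default port below are illustrative assumptions, not part of any listed project.

```python
import json
import urllib.request

def build_completion_request(prompt, n_predict=64, base_url="http://127.0.0.1:8080"):
    # Assemble a POST request for llama.cpp's /completion endpoint.
    # base_url assumes the default llama-server host/port; adjust as needed.
    payload = {"prompt": prompt, "n_predict": n_predict}
    return urllib.request.Request(
        f"{base_url}/completion",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending the request requires a running llama.cpp server:
# with urllib.request.urlopen(build_completion_request("Hello")) as resp:
#     print(json.loads(resp.read())["content"])
```

The generated text comes back in the `content` field of the JSON response.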

ASP.NET Core Web, WebApi & WPF implementations for LLama.cpp & LLamaSharp

Rack API application for Llama.cpp

A Qt GUI for large language models

A Javascript library (with Typescript types) to parse metadata of GGML based GGUF files.
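For context on what such a parser reads: a GGUF file starts with a fixed-size little-endian header of the magic bytes `GGUF`, a `uint32` version, a `uint64` tensor count, and a `uint64` metadata key-value count. The Python sketch below illustrates that header layout only; it is not the JavaScript library's API.

```python
import struct

GGUF_MAGIC = b"GGUF"

def parse_gguf_header(data: bytes) -> dict:
    # Parse the fixed 24-byte GGUF header (little-endian):
    # 4-byte magic, uint32 version, uint64 tensor count, uint64 metadata KV count.
    if data[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    (version,) = struct.unpack_from("<I", data, 4)
    tensor_count, kv_count = struct.unpack_from("<QQ", data, 8)
    return {"version": version, "tensor_count": tensor_count, "kv_count": kv_count}
```

The metadata key-value pairs (architecture, tokenizer, quantization details, and so on) follow this header in the file.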

Wingman is the fastest and easiest way to run Llama models on your PC or Mac.

Local LLM inference Library

GPTStonks is a financial chatbot powered by LLMs and enhanced with data frameworks. It provides natural language conversation capabilities for financial topics, making it an ideal choice for a wide range of financial applications.

Concepts and examples on using and training LLMs

CompanionLLM - A framework to finetune LLMs to be your own sentient conversational companion

100% Private & Simple. OSS 🐍 Code Interpreter for LLMs 🦙
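The core loop of any local code interpreter for LLMs is running model-generated code in an isolated process and capturing its output. A generic sketch of that pattern, assuming nothing about this particular project's implementation:

```python
import subprocess
import sys
import tempfile

def run_snippet(code: str, timeout: float = 5.0) -> str:
    # Write the LLM-generated snippet to a temp file and execute it in a
    # separate Python interpreter, capturing stdout. A real interpreter
    # would add sandboxing and resource limits on top of this.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run(
        [sys.executable, path],
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.stdout
```

The captured stdout is then fed back to the model so it can inspect the result of its own code.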

Empower your LLM to do more than you ever thought possible with these state-of-the-art prompt templates.

A Telegram bot for self-hosted local inference of Stable Diffusion, text-to-speech, and large language models such as Llama 3.

Getting an LLM to work with Godot.

A guidance compatibility layer for llama-cpp-python

A set of bash scripts to automate deployment of GGML/GGUF models [default: RWKV] with the use of KoboldCpp on Android - Termux

“Pacha” TUI (Text User Interface) is a JavaScript application that utilizes the “blessed” library. It serves as a frontend for llama.cpp and provides a convenient and straightforward way to perform inference using local language models.