A site that tracks advances in LLM technology. Software, papers, research, directories... looks like a bit of everything.
LocalAI is a drop-in replacement REST API compatible with the OpenAI API specification for local inferencing. It lets you run models locally or on-prem on consumer-grade hardware, and supports multiple model families compatible with the ggml format, so several LLMs can be served.
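Because LocalAI mirrors the OpenAI API, existing OpenAI client code can usually be pointed at it just by changing the base URL. A minimal sketch, assuming LocalAI is listening on localhost:8080 and a ggml model named `ggml-gpt4all-j` has been configured (both are assumptions about your setup, not fixed defaults of the project):

```python
# Minimal sketch of using LocalAI as a drop-in OpenAI replacement.
# Assumptions: LocalAI is reachable at localhost:8080 and a model named
# "ggml-gpt4all-j" is configured; adjust the base URL and model name
# for your installation.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # point the client at LocalAI instead of api.openai.com
    api_key="not-needed",                 # LocalAI does not require a real OpenAI key
)

response = client.chat.completions.create(
    model="ggml-gpt4all-j",
    messages=[{"role": "user", "content": "Say hello from a local model."}],
)
print(response.choices[0].message.content)
```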
llmapi-server is an abstract backend that wraps a variety of large language models (LLMs such as ChatGPT, GPT-3, and GPT-4) and provides simple access to them through OpenAPI.
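The sketch below only illustrates the general pattern of calling such an abstraction layer over HTTP; the host, port, endpoint path, and JSON fields are hypothetical placeholders, not llmapi-server's documented interface.

```python
import requests

# Hypothetical sketch: the URL, endpoint path, and payload fields are
# illustrative placeholders, not llmapi-server's actual API.
resp = requests.post(
    "http://localhost:5050/v1/chat",   # assumed address of a local llmapi-server instance
    json={
        "backend": "chatgpt",          # which wrapped LLM to route the request to
        "content": "Summarize this repository in one sentence.",
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```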