NocLLM

A lightweight, dynamic C++ library for interacting with cloud and local LLMs on ESP32/ESP8266.

Author
Muhammad Ikhwan Fathulloh
Maintainer
Nocturnailed Community
Website
https://github.com/Nocturnailed-Community/NocLLM
Category
Communication
License
MIT
Library Type
Contributed
Architectures
esp32, esp8266

NocLLM natively supports streaming responses from Large Language Models (LLMs) over OpenAI-compatible endpoints. It switches dynamically between HTTPS for cloud providers (OpenAI, Groq) and HTTP for local APIs (LM Studio, Ollama), and it uses a zero-dependency, ultra-lightweight custom JSON-extraction approach for fast parsing and low RAM usage.

Downloads

Filename          Release Date  File Size
NocLLM-1.0.0.zip  2026-04-11    13.56 KiB