LLM Chat Interface (CPU Optimized)

Developed a lightweight chat interface that supports multiple large language models, including LLaMA variants and DeepSeek Coder. Inference is CPU-optimized via llama.cpp, with optional GPU acceleration when a compatible device is available.
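
A minimal sketch of how such a CPU-first chat loop might look with the llama-cpp-python bindings; the model path, thread count, and n_gpu_layers value are illustrative assumptions rather than the project's actual configuration.

```python
# Minimal sketch using the llama-cpp-python bindings (assumed here);
# the model path and tuning values are placeholders, not the project's real config.
from llama_cpp import Llama

llm = Llama(
    model_path="models/deepseek-coder-6.7b-instruct.Q4_K_M.gguf",  # hypothetical GGUF file
    n_ctx=4096,        # context window
    n_threads=8,       # CPU threads used for inference
    n_gpu_layers=0,    # 0 = pure CPU; raise to offload layers when a GPU is present
)

messages = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    user_input = input("You: ")
    if user_input.strip().lower() in {"exit", "quit"}:
        break
    messages.append({"role": "user", "content": user_input})
    reply = llm.create_chat_completion(messages=messages, max_tokens=512)
    answer = reply["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": answer})
    print(f"Assistant: {answer}")
```

Keeping n_gpu_layers at 0 keeps the whole model on the CPU; increasing it offloads that many transformer layers to the GPU, which is how llama.cpp exposes optional acceleration without changing the rest of the code.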

Technologies Used: Large Language Models, Python, llama.cpp, Chat Interface
