Life as I know it!

Running LLMs locally with Ollama

Posted on August 21, 2025 by soubhagya

Traditionally, large language models (LLMs) have required high-end machines with powerful GPUs because of their substantial memory and compute demands. Recent advances, however, have produced smaller LLMs that run efficiently on consumer-grade hardware. Tools like llama.cpp make it possible to run these models directly on CPUs, making LLMs more accessible than ever. Installing and Running…
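Once a model is running locally under Ollama, it is typically queried over Ollama's REST API, which listens on port 11434 by default. The sketch below builds a request body for the `/api/generate` endpoint; the model name `llama3.2` is an illustrative example (you would first fetch it with `ollama pull`), not something specified in the post.

```python
import json

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    Ollama serves this endpoint locally at http://localhost:11434
    by default; "stream": False asks for one complete response
    instead of a stream of chunks.
    """
    return json.dumps({
        "model": model,    # example model name; pull it first with `ollama pull`
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")

body = build_generate_request("llama3.2", "Why is the sky blue?")

# To actually send the request (requires a running local Ollama server):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(json.loads(urllib.request.urlopen(req).read())["response"])
```

The network call is left commented out so the sketch stands on its own; uncomment it once an Ollama server is running on the default port.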

