
Local LLM

Python 3.10+ · Ollama 0.1.34+ · LM Studio 0.2.20+ · MIT License

Playing around with both LM Studio and Ollama. For the sake of simplicity, this repo does not touch llama.cpp.

Vibecoded with GitHub Copilot - GPT-4.1, Gemini 2.5 Pro, Claude Opus 4

Setup

  1. Clone this repo.

  2. Install uv, a fast Python package manager:

pip install uv

  3. Set up a virtual env (recommended):

uv venv
source .venv/bin/activate    # on Windows: .venv\Scripts\activate

  4. Install dependencies:

uv pip sync requirements.txt    # or requirements-dev.txt for development

Playing with LM Studio

  1. Download LM Studio (https://lmstudio.ai/)

  2. Download local LLM models from LM Studio

To interact only with the LM Studio LLM agent / chatbot, run

python src/lmstudio_api.py
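LM Studio's local server speaks the OpenAI chat-completions protocol (default base URL http://localhost:1234/v1). A rough sketch of what a wrapper like src/lmstudio_api.py has to do — the model name is a placeholder, and the actual wrapper in this repo may be structured differently:

```python
import json
import urllib.request

# LM Studio's local server exposes an OpenAI-compatible chat-completions
# endpoint; the default base URL is http://localhost:1234/v1. The model
# name below is a placeholder -- use a model loaded in LM Studio.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Build the JSON body for an OpenAI-style chat completion request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat(prompt, model="local-model"):
    """Send the request (requires LM Studio's server to be running)."""
    body = json.dumps(build_chat_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        LMSTUDIO_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```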

Playing with Ollama

  1. Download Ollama (https://ollama.com/)

  2. Download local LLM models from Ollama

  3. Run a model:

ollama run codellama:latest

To interact only with the Ollama LLM agent / chatbot, run

python src/ollama_api.py
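Ollama serves its own REST API on http://localhost:11434. A minimal sketch of a non-streaming generate call — roughly what a wrapper like src/ollama_api.py has to do, though the repo's actual wrapper may differ:

```python
import json
import urllib.request

# Ollama's REST API listens on http://localhost:11434 by default; a
# non-streaming generation request goes to /api/generate. The model must
# be one you have pulled (codellama:latest matches the step above).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt, model="codellama:latest"):
    """Build the JSON body for a non-streaming Ollama generate call."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="codellama:latest"):
    """Send the request (requires the Ollama server to be running)."""
    body = json.dumps(build_generate_request(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```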

Note - Ollama and LM Studio download models to different locations. There are OSS tools that can create a symlink between them; this repo does not include one.

Playing with LLM

To play with the overall LLM Agent / chatbot, run

python llm.py

Available Commands

  • chat - chat with the model
  • completion - auto complete
  • models - list all models from the selected provider (Ollama / LM Studio)
  • select - select a model
  • switch - switch between Ollama and LM Studio
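A command loop like this is typically a small dispatch table. The handler bodies below are placeholders for illustration, not the repo's actual implementation:

```python
def make_dispatcher(handlers):
    """Return a function that routes 'command arg...' lines to handlers."""
    def dispatch(line):
        cmd, _, arg = line.strip().partition(" ")
        handler = handlers.get(cmd.lower())
        if handler is None:
            return f"unknown command: {cmd!r} (try: {', '.join(sorted(handlers))})"
        return handler(arg)
    return dispatch

# Placeholder handlers -- the real commands live in llm.py.
handlers = {
    "chat": lambda arg: f"chatting: {arg}",
    "models": lambda arg: "listing models...",
    "select": lambda arg: f"selected model {arg}",
    "switch": lambda arg: "switched provider",
}
dispatch = make_dispatcher(handlers)
```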

Note - inference can be slow even though the model runs on localhost.

llm.py is a wrapper around both Ollama and LM Studio.
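A hypothetical sketch of such a wrapper (the real llm.py may be structured differently): each backend exposes the same chat() method, so switching providers is just swapping the active backend.

```python
class LLMWrapper:
    """Route chat calls to whichever provider backend is currently active."""

    def __init__(self, backends):
        # backends: e.g. {"ollama": OllamaAPI(), "lmstudio": LMStudioAPI()}
        # (names illustrative); the first entry is the default provider.
        self.backends = backends
        self.current = next(iter(backends))

    def switch(self, name):
        """Switch to another provider by name."""
        if name not in self.backends:
            raise ValueError(f"unknown provider: {name}")
        self.current = name

    def chat(self, prompt):
        """Delegate to the active backend's chat() method."""
        return self.backends[self.current].chat(prompt)
```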

Running Tests

To run the unit tests for the API wrappers and agent, use the following commands from the root of the project:

To run all tests in the tests directory:

python -m unittest discover tests

To run a specific test file, for example for LMStudioAPI:

python -m unittest tests/test_lmstudio_api.py

Make sure your LM Studio and/or Ollama servers are running before running the tests, as the tests will attempt to connect to them.
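The repo's actual tests may differ, but one common pattern for tests that need a live server is to skip (rather than fail) when nothing is listening. A sketch:

```python
import unittest
import urllib.request
import urllib.error

OLLAMA_BASE = "http://localhost:11434"   # Ollama's default port

def server_up(url):
    """Return True if anything answers at url (any HTTP status counts)."""
    try:
        urllib.request.urlopen(url, timeout=1)
        return True
    except urllib.error.HTTPError:
        return True        # the server answered, just with an error status
    except OSError:
        return False       # connection refused, timeout, DNS failure, ...

class TestOllamaAPI(unittest.TestCase):
    # Skip instead of fail when the local server is down.
    @unittest.skipUnless(server_up(OLLAMA_BASE), "Ollama server not running")
    def test_server_reachable(self):
        self.assertTrue(server_up(OLLAMA_BASE))
```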

Linting

Linting is done with ruff and automatically runs on git commit via pre-commit hooks.

Manual linting:

ruff check .
ruff check . --fix
ruff format .

Setup pre-commit hooks:

pre-commit install

Run pre-commit manually:

pre-commit run --all-files

Project Management

Dependency management is done through pyproject.toml; the requirements files are compiled and synced with uv:

# For main dependencies
uv pip compile pyproject.toml -o requirements.txt

# For development dependencies
uv pip compile pyproject.toml --extra dev -o requirements-dev.txt
# For development
uv pip sync requirements-dev.txt

# For production/CI (if you don't have dev tools there)
uv pip sync requirements.txt
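The commands above assume a pyproject.toml shaped roughly like this — the dependency names below are illustrative, not the repo's actual ones:

```toml
[project]
name = "llm"
requires-python = ">=3.10"
# Illustrative only -- the real list lives in this repo's pyproject.toml.
dependencies = ["requests"]

[project.optional-dependencies]
# Extras pulled in by `uv pip compile pyproject.toml --extra dev`.
dev = ["ruff", "pre-commit"]
```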

License

This project is licensed under the MIT License.
