An Ollama chat web application


Keywords
ollama, chatbot, conversational, AI, artificial, intelligence
License
MIT
Install
pip install ollama-chat==0.9.19

Documentation

ollama-chat


Ollama Chat is a web chat client for Ollama that allows you to chat locally (and privately) with Large Language Models (LLMs).

Features

  • Platform independent - tested on macOS, Windows, and Linux
  • Chat with any local Ollama model
  • Save conversations for later viewing and interaction
  • Single and multiline prompts
  • Regenerate the most recent conversation response
  • Delete the most recent conversation exchange
  • View responses as Markdown or text
  • Save conversations as Markdown text
  • Multiple concurrent chat responses (with proper Ollama configuration)

Installation

To get up and running with Ollama Chat, follow these steps:

  1. Install and start Ollama

  2. Install Ollama Chat

    pip install ollama-chat
    

Updating

To update Ollama Chat:

pip install -U ollama-chat

Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

ollama-chat

Your default web browser launches and opens the Ollama Chat web application.

By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
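As a quick sanity check, you can locate the configuration file from Python. This is a minimal sketch; it assumes only what the text above states (the file name "ollama-chat.json" and its default location in the home directory), not anything about the file's contents:

```python
from pathlib import Path

# Default Ollama Chat configuration file location, per the note above
config_path = Path.home() / "ollama-chat.json"

# The file exists only after Ollama Chat has been started at least once
if config_path.is_file():
    print(f"Found configuration: {config_path}")
else:
    print(f"No configuration yet at: {config_path}")
```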

Start Conversation from CLI

To start a conversation from the command line, use the -m argument:

ollama-chat -m "Why is the sky blue?"
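If you want to launch a conversation from a script rather than an interactive shell, quoting the prompt correctly matters. The sketch below builds the same command as an argument list (which avoids shell-quoting pitfalls) and shows the equivalent shell string; the `subprocess` launch is commented out since it assumes `ollama-chat` is on your PATH:

```python
import shlex
import subprocess

prompt = "Why is the sky blue?"

# Passing the command as a list avoids shell-quoting issues entirely
cmd = ["ollama-chat", "-m", prompt]

# The equivalent, safely-quoted shell command line
print(shlex.join(cmd))  # ollama-chat -m 'Why is the sky blue?'

# Uncomment to actually launch Ollama Chat with the prompt:
# subprocess.run(cmd, check=True)
```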

File Format and API Documentation

Ollama Chat File Format

Ollama Chat API

Development

This package is developed using python-build. It was started using python-template as follows:

template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1