This is the Linux app named llm-ollama, whose latest release can be downloaded as the 0.16.1 source code archive (tar.gz). It can be run online for free in the OnWorks hosting provider for workstations.
Download and run this app named llm-ollama online with OnWorks for free.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application to that file manager.
- 4. Start the OnWorks Linux online, Windows online, or macOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it, and run it.
SCREENSHOTS
llm-ollama
DESCRIPTION
llm-ollama is a plugin for the LLM CLI ecosystem that enables seamless access to models hosted on an Ollama server through a unified command-line interface. It automatically discovers available models from the connected Ollama instance and registers them for use within the CLI, making it easy to run prompts, chat sessions, and embedding operations without manual configuration. The plugin supports both local and remote Ollama servers, allowing flexible deployment scenarios ranging from local machines to cloud-hosted environments. It includes support for multimodal inputs such as image attachments, enabling interaction with vision-capable models. The system also integrates tool usage, allowing models to call predefined utilities such as web search and fetch operations to enhance responses. Additionally, it supports structured outputs through JSON schemas and asynchronous execution for advanced workflows.
Features
- Automatic discovery and registration of Ollama models
- Support for local and remote server connections
- Multimodal input handling with image attachments
- Tool integration for extended model capabilities
- JSON schema support for structured outputs
- Async execution for advanced workflows
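To illustrate what the plugin does behind the scenes: once installed into the LLM CLI (typically via `llm install llm-ollama`), discovered models are addressed by their Ollama names, e.g. `llm -m llama3.2 "Three keywords for this text"`. Under the hood, requests are forwarded to the Ollama server's HTTP API. The sketch below builds such a request by hand using only the standard library; the default address `http://localhost:11434` and the model name `llama3.2` are illustrative assumptions, not values taken from the llm-ollama source.

```python
import json
import urllib.request

# Assumption: an Ollama server listening on its default local address.
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str,
                           stream: bool = False) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint --
    the kind of call a CLI plugin issues on the user's behalf."""
    payload = json.dumps({"model": model,
                          "prompt": prompt,
                          "stream": stream}).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Actually sending the request requires a running Ollama server:
# with urllib.request.urlopen(build_generate_request("llama3.2", "Hello")) as resp:
#     print(json.loads(resp.read())["response"])
```

Pointing the `OLLAMA_URL` base at a remote host instead of `localhost` is what enables the remote-server scenarios described above.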
Programming Language
Python
Categories
This application can also be fetched from https://sourceforge.net/projects/llm-ollama.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.