This is the Linux app named lms, whose latest release can be downloaded as lmssourcecode.zip. It can be run online with OnWorks, a free hosting provider for workstations.
Download and run this app named lms online with OnWorks for free.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Open our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload the application through that file manager.
- 4. Start the OnWorks Linux online, Windows online, or macOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, then install and run it.
SCREENSHOTS:
lms
DESCRIPTION:
lms is a command-line interface (CLI) tool for interacting with and managing local large language models through the LM Studio ecosystem. It lets developers control model execution directly from the terminal, providing programmatic access to features that are otherwise available only through the graphical interface. Through the CLI, users can load and unload models, start or stop local inference servers, and inspect the raw inputs and outputs generated by language models. lms is built with the LM Studio JavaScript/TypeScript SDK and integrates tightly with the LM Studio runtime environment. The interface is designed to simplify automation and scripting tasks related to local AI deployment. By exposing model management capabilities as command-line commands, the tool enables developers to integrate local LLM operations into development pipelines and backend services. As a result, lms acts as a bridge between interactive local AI tools and automated software development workflows.
Features
- Command-line interface for controlling local language models
- Tools for loading and unloading models from system memory
- Ability to start and stop local inference API servers
- Inspection of raw model inputs and outputs during execution
- Integration with LM Studio runtime and SDK ecosystem
- Support for scripting and automation of local AI workflows
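As an illustration of the workflow these features enable, a typical terminal session might look like the sketch below. The subcommand names shown here (`server start`, `ls`, `load`, `unload`, `log stream`) match the lms CLI as commonly documented, but the exact names and flags should be verified against `lms --help` for the installed version, and `<model-name>` is a placeholder:

```shell
# Start the local inference API server (listens on localhost by default)
lms server start

# List models available on disk, then load one into memory
lms ls
lms load <model-name>

# Inspect raw prompts and responses as they pass through the server
lms log stream

# Unload models and stop the server when finished
lms unload --all
lms server stop
```

Because each step is a plain shell command, these operations can be composed into scripts and CI pipelines, which is the automation use case described above.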
Programming Language
TypeScript
Categories
This application can also be fetched from https://sourceforge.net/projects/lms.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.