This is the Linux app named llama.vscode, whose latest release can be downloaded as v0.0.45sourcecode.tar.gz. It can be run online through the free hosting provider OnWorks for workstations.
Download and run this app named llama.vscode online for free with OnWorks.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Sign in to our file manager at https://www.onworks.net/myfiles.php?username=XXXXX with the username of your choice.
- 3. Upload the application to that file manager.
- 4. Start the OnWorks Linux, Windows, or macOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager at https://www.onworks.net/myfiles.php?username=XXXXX using the same username.
- 6. Download the application, install it, and run it.
SCREENSHOTS:
llama.vscode
DESCRIPTION:
llama.vscode is a Visual Studio Code extension that provides AI-assisted coding features powered primarily by locally running language models. The extension is designed to be lightweight and efficient, enabling developers to use AI tools even on consumer-grade hardware. It integrates with the llama.cpp runtime to run language models locally, eliminating the need to rely entirely on external APIs or cloud providers. The extension supports common AI development features such as code completion, conversational chat assistance, and AI-assisted code editing directly within the IDE. Developers can select and manage models through a configuration interface that automatically downloads and runs the required models locally. The extension also supports agent-style coding workflows, where AI tools can perform more complex tasks such as analyzing project context or editing multiple files.
Features
- VS Code extension for AI-assisted coding and development
- Local model execution using the llama.cpp inference runtime
- Inline code completion and AI-powered editing features
- Integrated chat interface for asking questions within the IDE
- Agent-style coding workflows with project context awareness
- Optional support for external model endpoints and APIs
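To illustrate how such an extension talks to a locally running model, here is a minimal TypeScript sketch of a fill-in-the-middle completion request against llama.cpp's HTTP server (`llama-server`). The port (8012), the `/infill` endpoint, and the request fields are assumptions based on llama.cpp's server API, not an excerpt of llama.vscode's actual source.

```typescript
// Sketch: requesting a code completion from a local llama.cpp server.
// Assumes llama-server is running locally, e.g.:
//   llama-server -m some-model.gguf --port 8012
// (port and endpoint shape are assumptions; adjust to your setup)

interface InfillRequest {
  input_prefix: string; // code before the cursor
  input_suffix: string; // code after the cursor
  n_predict: number;    // maximum number of tokens to generate
}

// Build the JSON body for a fill-in-the-middle request.
function buildInfillRequest(
  prefix: string,
  suffix: string,
  nPredict = 64
): InfillRequest {
  return { input_prefix: prefix, input_suffix: suffix, n_predict: nPredict };
}

// Send the request to the local server and return the generated text.
async function requestCompletion(req: InfillRequest): Promise<string> {
  const res = await fetch("http://127.0.0.1:8012/infill", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  const data = (await res.json()) as { content: string };
  return data.content;
}
```

Because everything runs against 127.0.0.1, no code or prompts leave the machine, which is the main privacy and cost advantage of the local-model setup the extension is built around.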
Programming Language
TypeScript
Categories
This application can also be fetched from https://sourceforge.net/projects/llama-vscode.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.