vLLM download for Linux

This is the Linux app named vLLM, whose latest release can be downloaded as v0.2.1.post1sourcecode.zip. It can be run online via the free hosting provider OnWorks for workstations.


Download and run this app named vLLM online with OnWorks for free.

Follow these instructions to run this app:

- 1. Download this application to your PC.

- 2. Enter our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.

- 3. Upload this application to that file manager.

- 4. Start the OnWorks Linux online, Windows online, or macOS online emulator from this website.

- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.

- 6. Download the application, install it, and run it.

SCREENSHOTS:


vLLM


DESCRIPTION:

vLLM is a fast and easy-to-use library for LLM inference and serving. It provides high-throughput serving with various decoding algorithms, including parallel sampling, beam search, and more.
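As a minimal sketch of what inference with this library looks like, the offline API can be used roughly as follows. This assumes a machine with a CUDA GPU and `vllm` installed via pip; the model name is only an example.

```python
# Hedged sketch: requires a CUDA GPU and `pip install vllm`.
from vllm import LLM, SamplingParams

# Load an example Hugging Face model (any supported model name works).
llm = LLM(model="facebook/opt-125m")

# Sampling settings for generation.
params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

# Generate completions for one or more prompts in a single batch.
outputs = llm.generate(["Hello, my name is"], params)
for out in outputs:
    print(out.outputs[0].text)
```

Because vLLM batches incoming prompts continuously, passing several prompts in one `generate` call is typically much faster than looping over them one at a time.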



Features

  • State-of-the-art serving throughput
  • Efficient management of attention key and value memory with PagedAttention
  • Continuous batching of incoming requests
  • Optimized CUDA kernels
  • Seamless integration with popular Hugging Face models
  • Tensor parallelism support for distributed inference
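The decoding features listed above (parallel sampling and beam search) are selected through `SamplingParams`. A sketch, again assuming a GPU-equipped machine with `vllm` installed and using an example model name:

```python
# Hedged sketch: requires a CUDA GPU and `pip install vllm`.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # example model

# Parallel sampling: n independent completions per prompt.
parallel = SamplingParams(n=4, temperature=0.9, max_tokens=32)

# Beam search decoding over n beams (greedy within each beam).
beam = SamplingParams(n=4, use_beam_search=True, temperature=0.0, max_tokens=32)

samples = llm.generate(["The capital of France is"], parallel)
print([o.text for o in samples[0].outputs])  # four independent samples
```

Tensor parallelism for distributed inference is enabled separately, by passing `tensor_parallel_size` when constructing the `LLM` (one process per GPU shard).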


Programming Language

Python


Categories

Large Language Models (LLM)

This application can also be fetched from https://sourceforge.net/projects/vllm.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.


