
MobileLLM download for Linux

Download the MobileLLM Linux app for free and run it online in Ubuntu, Fedora, or Debian.

This is the Linux app named MobileLLM, whose latest release can be downloaded as MobileLLMsourcecode.tar.gz. It can be run online on OnWorks, the free hosting provider for workstations.

Download and run this app named MobileLLM online with OnWorks for free.

Follow these instructions in order to run this app:

- 1. Download this application to your PC.

- 2. In our file manager https://www.onworks.net/myfiles.php?username=XXXXX enter the username that you want.

- 3. Upload this application in that file manager.

- 4. Start the OnWorks Linux online or Windows online emulator or MacOS online emulator from this website.

- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.

- 6. Download the application, install it, and run it (a minimal sketch of this last step follows the list).
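
As a rough illustration of that last step, the sketch below unpacks the release archive named above (MobileLLMsourcecode.tar.gz) and lists what it contains. It is a sketch only: the extraction directory name, and anything you do afterwards such as installing requirements or launching training scripts, are assumptions about a typical source release, not instructions from OnWorks or the MobileLLM authors.

    # Minimal sketch: unpack MobileLLMsourcecode.tar.gz and inspect the source tree.
    # Assumes the archive is already in the current working directory.
    import tarfile
    from pathlib import Path

    archive = Path("MobileLLMsourcecode.tar.gz")
    target = Path("MobileLLM")                      # assumed extraction directory

    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(path=target)                 # unpack the source tree

    # List a few of the unpacked files before installing or running anything.
    for entry in sorted(target.rglob("*"))[:10]:
        print(entry)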

SCREENSHOTS



MobileLLM


DESCRIPTION

MobileLLM is a lightweight large language model (LLM) framework developed by Facebook Research, optimized for on-device deployment where computational and memory efficiency are critical. Introduced in the ICML 2024 paper “MobileLLM: Optimizing Sub-billion Parameter Language Models for On-Device Use Cases”, it focuses on delivering strong reasoning and generalization capabilities in models under one billion parameters. The framework integrates several architectural innovations—SwiGLU activation, deep and thin network design, embedding sharing, and grouped-query attention (GQA)—to achieve a superior trade-off between model size, inference speed, and accuracy. MobileLLM demonstrates remarkable performance, with the 125M and 350M variants outperforming previous state-of-the-art models of the same scale by up to 4.3% on zero-shot commonsense reasoning tasks.
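
To make the architectural ingredients above more concrete, here is a minimal PyTorch sketch of a SwiGLU feed-forward block of the kind the description refers to. It is an illustration only: the class name, layer names, and the "deep and thin" sizes chosen below are assumptions for demonstration, not MobileLLM's actual code.

    # Illustrative SwiGLU feed-forward block (not MobileLLM's actual implementation).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SwiGLUFeedForward(nn.Module):
        def __init__(self, dim: int, hidden_dim: int):
            super().__init__()
            self.w_gate = nn.Linear(dim, hidden_dim, bias=False)  # gating projection
            self.w_up = nn.Linear(dim, hidden_dim, bias=False)    # value projection
            self.w_down = nn.Linear(hidden_dim, dim, bias=False)  # back to model width

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # SwiGLU: SiLU(x @ W_gate) gated elementwise against x @ W_up
            return self.w_down(F.silu(self.w_gate(x)) * self.w_up(x))

    # A "deep and thin" model keeps dim small and stacks many such blocks.
    block = SwiGLUFeedForward(dim=576, hidden_dim=1536)
    out = block(torch.randn(1, 16, 576))   # (batch, sequence, model width)
    print(out.shape)                       # torch.Size([1, 16, 576])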



FEATURES

  • Optimized transformer architecture for sub-billion parameter LLMs
  • Combines SwiGLU activation, embedding sharing, and grouped-query attention (illustrated in the sketch after this list)
  • Supports distributed multi-node pretraining with PyTorch ≥ 2.0
  • Delivers state-of-the-art zero-shot reasoning results across multiple tasks
  • Includes reproducible training and evaluation pipelines for multiple model sizes
  • Scalable design philosophy extending from 125M to 1.5B parameters
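
The sketch below illustrates the grouped-query attention mentioned in the feature list: several query heads share each key/value head, so the KV projections and the KV cache shrink, which is what matters for on-device inference. The head counts and shapes are assumptions for illustration, and the code is a generic sketch built on PyTorch 2.x's scaled_dot_product_attention rather than MobileLLM's own attention module.

    # Minimal grouped-query attention (GQA) sketch: 8 query heads share 2 KV heads.
    import torch
    import torch.nn.functional as F

    def grouped_query_attention(q, k, v):
        # q: (batch, seq, n_query_heads, head_dim); k, v: (batch, seq, n_kv_heads, head_dim)
        group = q.shape[2] // k.shape[2]
        # Repeat each KV head so it serves `group` query heads.
        k = k.repeat_interleave(group, dim=2)
        v = v.repeat_interleave(group, dim=2)
        q, k, v = (t.transpose(1, 2) for t in (q, k, v))   # -> (batch, heads, seq, head_dim)
        out = F.scaled_dot_product_attention(q, k, v)      # available in PyTorch >= 2.0
        return out.transpose(1, 2)                         # back to (batch, seq, heads, head_dim)

    batch, seq, head_dim = 1, 16, 64
    q = torch.randn(batch, seq, 8, head_dim)               # 8 query heads
    k = torch.randn(batch, seq, 2, head_dim)               # only 2 key/value heads
    v = torch.randn(batch, seq, 2, head_dim)
    print(grouped_query_attention(q, k, v).shape)          # torch.Size([1, 16, 8, 64])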


PROGRAMMING LANGUAGE

Python, Unix Shell


CATEGORIES

Large Language Models (LLM)

This application can also be fetched from https://sourceforge.net/projects/mobilellm.mirror/. It has been hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.


