This is the Linux app named ChatLLM.cpp, whose latest release can be downloaded as chatllm_win_x64.7z. It can be run online via OnWorks, a free hosting provider for workstations.
Download and run this app named ChatLLM.cpp online with OnWorks for free.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Enter in our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application in that file manager.
- 4. Start the OnWorks Linux online or Windows online emulator or MacOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it and run it.
SCREENSHOTS
ChatLLM.cpp
DESCRIPTION
chatllm.cpp is a pure C++ implementation designed for real-time chatting with Large Language Models (LLMs) on personal computers, supporting both CPU and GPU execution. It enables users to run various LLMs ranging from under 1 billion to over 300 billion parameters, providing responsive and efficient conversational AI experiences without relying on external servers.
FEATURES
- Pure C++ implementation for LLM inference
- Supports models from <1B to >300B parameters
- Real-time chatting capabilities
- Compatible with CPU and GPU execution
- No dependency on external servers
- Facilitates responsive conversational AI
- Open-source and customizable
- Integrates with various LLM architectures
- Active community support
PROGRAMMING LANGUAGE
C++
CATEGORIES
This is an application that can also be fetched from https://sourceforge.net/projects/chatllm-cpp.mirror/. It has been hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.