This is the Linux app named ChatLLM.cpp, whose latest release can be downloaded as chatllm_win_x64.7z. It can be run online on OnWorks, a free hosting provider for workstations.
Download and run this app named ChatLLM.cpp online for free with OnWorks.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Enter our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application in that file manager.
- 4. Start the OnWorks Linux online or Windows online emulator or MACOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it and run it.
SCREENSHOTS
ChatLLM.cpp
DESCRIPTION
chatllm.cpp is a pure C++ implementation designed for real-time chatting with Large Language Models (LLMs) on personal computers, supporting both CPU and GPU execution. It enables users to run various LLMs ranging from less than 1 billion to over 300 billion parameters, providing responsive and efficient conversational AI without relying on external servers.
FEATURES
- Pure C++ implementation for LLM inference
- Supports models from <1B to >300B parameters
- Real-time chatting capabilities
- Compatible with CPU and GPU executions
- No dependency on external servers
- Facilitates responsive conversational AI
- Open-source and customizable
- Integrates with various LLM architectures
- Active community support
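If you prefer to build and try chatllm.cpp locally on a Linux machine (instead of, or before, running it through OnWorks), the typical workflow looks roughly like the following sketch. It assumes CMake and a C++ compiler are installed; the repository URL matches the project's upstream mirror, but the exact build options, binary path, and `-m`/`-i` flags are assumptions that should be checked against the project's README.

```shell
# Clone the project together with its submodules (backend libraries).
git clone --recursive https://github.com/foldl/chatllm.cpp.git
cd chatllm.cpp

# Configure and build the release binaries with CMake.
cmake -B build
cmake --build build -j --config Release

# Start an interactive chat session with a local model file.
# "model.bin" is a placeholder: a model must first be downloaded and
# converted to the project's own format (see the upstream README).
./build/bin/main -m model.bin -i
```

Since inference runs entirely on the local CPU or GPU, no external server is contacted once the model file is on disk, which matches the "no dependency on external servers" feature listed above.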
PROGRAMMING LANGUAGE
C++
CATEGORIES
This is an application that can also be fetched from https://sourceforge.net/projects/chatllm-cpp.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.