This is the Linux app named InternLM, whose latest release can be downloaded as InternLM-v0.2.1dev20240102sourcecode.tar.gz. It can be run online through the free hosting provider OnWorks for workstations.
Download and run this app named InternLM online with OnWorks for free.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application to that file manager.
- 4. Start the OnWorks Linux online, Windows online, or MacOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it, and run it.
SCREENSHOTS
InternLM
DESCRIPTION
InternLM is an open-source family of multilingual foundation and chat models, accompanied by an ecosystem that supports training, inference, and application development. The repository highlights multiple model sizes intended to serve different needs, from efficient research and prototyping to more capable deployments for complex scenarios. Beyond model weights, the project emphasizes an ecosystem view, pointing developers to compatible tools and projects across training and inference so teams can build end-to-end workflows. InternLM’s direction includes strong general-purpose capabilities and ongoing iterations that target improved reasoning, coding, and tool-use behaviors. The broader InternLM ecosystem also includes training tooling and guidance aimed at making fine-tuning and adaptation more accessible across hardware setups, including smaller single-GPU environments and larger multi-node configurations.
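As a minimal illustration of integrating a chat model like InternLM into a downstream app, the sketch below accumulates conversation turns and renders them into a single role-tagged prompt string. The tag format, class names, and system message here are hypothetical placeholders, not InternLM's actual chat template; a real deployment would rely on the tokenizer's own template and an inference backend for generation.

```python
# Hypothetical sketch: assembling a role-tagged chat prompt for an
# instruction-tuned model. The <system>/<user>/<assistant> tags are
# illustrative placeholders, NOT InternLM's real chat template.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ChatSession:
    """Accumulates user/assistant turns and renders one prompt string."""
    system: str = "You are a helpful assistant."
    turns: List[Tuple[str, str]] = field(default_factory=list)  # (role, text)

    def add(self, role: str, text: str) -> None:
        self.turns.append((role, text))

    def render(self) -> str:
        # Emit one tagged block per turn, ending with an open assistant
        # tag so the model would continue generating from that point.
        parts = [f"<system>{self.system}</system>"]
        for role, text in self.turns:
            parts.append(f"<{role}>{text}</{role}>")
        parts.append("<assistant>")
        return "\n".join(parts)


session = ChatSession()
session.add("user", "Summarize InternLM in one sentence.")
prompt = session.render()
```

In a real application, `prompt` (or the structured `turns` list) would be handed to the model's tokenizer and inference runtime; the rendering step shown here only demonstrates the shape of the integration.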
Features
- Open-source multilingual foundation and chat model series across multiple sizes
- Ecosystem alignment with training, inference, and application tooling
- Support for instruction-tuned variants aimed at general-purpose usage
- Guidance and compatibility paths for integrating InternLM models into downstream apps
- Training and fine-tuning support via ecosystem tooling for varied hardware constraints
- Ongoing model iterations targeting improvements in reasoning, coding, and tool use
Programming Language
Python
Categories
This application can also be fetched from https://sourceforge.net/projects/internlm.mirror/. It has been hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.