This is the Linux app named LLaMA-Factory, whose latest release can be downloaded as v0.9.3_Llama4,Gemma3,Qwen3,InternVL3,Qwen2.5-Omnisourcecode.tar.gz. It can be run online through the free hosting provider OnWorks for workstations.
Download and run this app named LLaMA-Factory online with OnWorks for free.
Follow these instructions in order to run this app:
- 1. Download this application to your PC.
- 2. Go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload the application to that file manager.
- 4. Start the OnWorks Linux online, Windows online, or macOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it and run it.
SCREENSHOTS:
LLaMA-Factory
DESCRIPTION:
LLaMA-Factory is a fine-tuning and training framework for Meta's LLaMA language models and other open models named in its releases, such as Gemma and Qwen. It enables researchers and developers to train and customize these models efficiently using advanced optimization techniques.
Features
- Supports fine-tuning of LLaMA models for specific tasks
- Optimized training with LoRA and QLoRA to reduce resource usage (see the configuration sketch after this list)
- Provides pre-configured training scripts for faster setup
- Multi-GPU and distributed training support for scalability
- Integrates seamlessly with Hugging Face Transformers
- Compatible with FP16 and BF16 precision for efficient computation
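As a rough sketch of what a pre-configured run looks like (assuming LLaMA-Factory is installed locally and exposes its llamafactory-cli entry point; the YAML keys shown follow the project's example configs and may differ between releases), a LoRA fine-tuning job is described in a small YAML file and launched from the command line:

    # lora_sft.yaml -- hypothetical minimal LoRA SFT config
    model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct   # any supported Hugging Face model
    stage: sft                 # supervised fine-tuning
    do_train: true
    finetuning_type: lora      # train low-rank adapters instead of full weights
    lora_target: all           # attach LoRA to all eligible linear layers
    dataset: identity          # one of the bundled demo datasets
    template: llama3           # chat template matching the base model
    output_dir: saves/llama3-8b-lora-sft
    per_device_train_batch_size: 1
    learning_rate: 1.0e-4
    num_train_epochs: 3.0
    bf16: true                 # use BF16 precision on supported GPUs

    # launch the run on the current machine
    llamafactory-cli train lora_sft.yaml

The same configuration style is reused for QLoRA or multi-GPU runs; only the relevant keys (for example quantization or distributed-training settings) change, while the launch command stays the same.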
Programming Language
Python
Categories
This application can also be fetched from https://sourceforge.net/projects/llama-factory.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.