This is the Linux app named SimSiam, whose latest release can be downloaded as simsiamsourcecode.tar.gz. It can be run online through OnWorks, a free hosting provider for workstations.
Download and run this app named SimSiam online with OnWorks for free.
Follow these instructions to run this app:
- 1. Download this application to your PC.
- 2. Enter our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application to that file manager.
- 4. Start the OnWorks Linux online or Windows online emulator, or the MacOS online emulator, from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it and run it.
SCREENSHOTS
SimSiam
DESCRIPTION
SimSiam is a PyTorch implementation of “Exploring Simple Siamese Representation Learning” by Xinlei Chen and Kaiming He. The project introduces a minimalist approach to self-supervised learning that avoids negative pairs, momentum encoders, or large memory banks—key complexities of prior contrastive methods. SimSiam learns image representations by maximizing similarity between two augmented views of the same image through a Siamese neural network with a stop-gradient operation, preventing feature collapse. This elegant yet effective design achieves strong results in unsupervised learning benchmarks such as ImageNet without requiring contrastive losses. The repository provides scripts for both unsupervised pre-training and linear evaluation, using a ResNet-50 backbone by default. It is compatible with multi-GPU distributed training and can be fine-tuned or transferred to downstream tasks like object detection following the same setup as MoCo.
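As a rough illustration of the idea described above, the following minimal PyTorch sketch shows the symmetric negative-cosine-similarity objective with the stop-gradient applied to the encoder outputs. The names (`simsiam_loss`, `encoder`, `predictor`) are illustrative assumptions and do not mirror the repository's actual classes; see the project's builder code for the real implementation.

```python
import torch
import torch.nn.functional as F

def simsiam_loss(p1, p2, z1, z2):
    """Symmetric SimSiam objective: negative cosine similarity between each
    view's predictor output and the detached projection of the other view."""
    # stop-gradient (detach) on the projections is what prevents collapse
    z1, z2 = z1.detach(), z2.detach()
    return -(F.cosine_similarity(p1, z2, dim=-1).mean()
             + F.cosine_similarity(p2, z1, dim=-1).mean()) / 2

# Usage with two augmented views x1, x2 of the same batch of images:
#   z1, z2 = encoder(x1), encoder(x2)      # backbone + projection MLP (hypothetical)
#   p1, p2 = predictor(z1), predictor(z2)  # prediction MLP (hypothetical)
#   loss = simsiam_loss(p1, p2, z1, z2)
#   loss.backward()
```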
Features
- Minimal self-supervised learning framework without negative pairs or momentum encoders
- PyTorch-based implementation optimized for distributed multi-GPU training
- Fully reproducible training pipeline for ImageNet using default hyperparameters from the paper
- Includes both unsupervised pre-training and linear evaluation scripts (see the sketch after this list)
- LARS optimizer support via NVIDIA Apex for large-batch training
- Compatible with object detection transfer setups from MoCo
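As a rough sketch of the linear evaluation step mentioned above, the snippet below freezes a ResNet-50 backbone and trains only a fresh linear classifier on top of its features. The checkpoint path is a placeholder; the repository's own linear evaluation script handles weight loading, LARS, and the distributed training loop.

```python
import torch
import torch.nn as nn
import torchvision.models as models

# Build a ResNet-50; in the real pipeline the SimSiam pre-trained weights
# would be loaded here (placeholder path, commented out).
model = models.resnet50(num_classes=1000)
# state = torch.load("checkpoint_simsiam.pth.tar")  # hypothetical checkpoint
# model.load_state_dict(state, strict=False)

# Freeze everything except the final linear layer and re-initialize it.
for name, param in model.named_parameters():
    if name not in ("fc.weight", "fc.bias"):
        param.requires_grad = False
model.fc.weight.data.normal_(mean=0.0, std=0.01)
model.fc.bias.data.zero_()

# Optimize only the classifier parameters.
params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(params, lr=0.1, momentum=0.9, weight_decay=0.0)
criterion = nn.CrossEntropyLoss()
```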
Programming Language
Python
Categories
This is an application that can also be fetched from https://sourceforge.net/projects/simsiam.mirror/. It has been hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.