PRITHIV SAKTHI U R (PRITHIVSAKTHIUR)
🔥 Making GPUs go brrrrrrrr.
PRITHIVSAKTHIUR / run_finetune_vit.py
Last active October 6, 2025 11:28
A Comprehensive Single-Shot Fine-Tuning Code Snippet for ViT (vit-base-patch16-224-in21k) on Domain-Specific Downstream Image Classification Tasks.
# Fine-Tuning ViT for Image Classification | Script prepared by: hf.co/prithivMLmods
#
# Dataset with Train & Test Splits
#
# In this configuration, the dataset is already organized into separate training and testing splits. This setup is ideal for straightforward supervised learning workflows.
#
# Training Phase:
# The model is fine-tuned exclusively on the train split, where each image is paired with its corresponding class label.
#
# Evaluation Phase:
# The fine-tuned model is evaluated on the held-out test split to measure how well it generalizes to unseen images.
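A first bookkeeping step in a fine-tune like this is wiring class names into the model config. The sketch below shows only that step; the `build_label_maps` helper and the class names are hypothetical, but the `id2label`/`label2id` shape is what `transformers` model configs expect:

```python
# Minimal sketch (assumption: the dataset's train split exposes its
# class names, e.g. via a ClassLabel feature in Hugging Face datasets).
# The model config for a ViT classifier takes num_labels plus the
# id2label / label2id mappings built here.

def build_label_maps(class_names):
    """Map class names to integer ids and back, in the form expected
    by transformers' model configs (id2label / label2id)."""
    id2label = {i: name for i, name in enumerate(class_names)}
    label2id = {name: i for i, name in enumerate(class_names)}
    return id2label, label2id

# Hypothetical class names for a domain-specific task.
classes = ["cat", "dog", "bird"]
id2label, label2id = build_label_maps(classes)
print(id2label[0], label2id["bird"])  # cat 2
```

These mappings are what let the fine-tuned checkpoint report human-readable class names at inference time instead of bare integer ids.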
PRITHIVSAKTHIUR / run_finetune_siglip2.py
Last active October 6, 2025 11:27
A Comprehensive Single-Shot Fine-Tuning Code Snippet for SigLIP2 on Domain-Specific Downstream Image Classification Tasks.
# Fine-Tuning SigLIP2 for Image Classification | Script prepared by: hf.co/prithivMLmods
#
# Dataset with Train & Test Splits
#
# In this configuration, the dataset is already organized into separate training and testing splits. This setup is ideal for straightforward supervised learning workflows.
#
# Training Phase:
# The model is fine-tuned exclusively on the train split, where each image is paired with its corresponding class label.
#
# Evaluation Phase:
# The fine-tuned model is evaluated on the held-out test split to measure how well it generalizes to unseen images.
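For the evaluation phase described above, a common pattern is a metric callback that receives raw logits and gold labels and returns top-1 accuracy. This is a hedged sketch of that computation, not the script's actual metric function; the function name and sample values are illustrative:

```python
# Sketch of an evaluation-phase metric: top-1 accuracy from raw logits,
# the (logits, labels) shape a compute_metrics callback for
# transformers' Trainer typically receives.
import numpy as np

def accuracy_from_logits(logits, labels):
    preds = np.argmax(logits, axis=-1)      # predicted class per example
    return float((preds == labels).mean())  # fraction of correct predictions

# Illustrative batch: 3 examples, 2 classes.
logits = np.array([[2.0, 0.1], [0.2, 1.5], [3.0, -1.0]])
labels = np.array([0, 1, 1])
print(accuracy_from_logits(logits, labels))
```

Passing a callback like this to the trainer is what turns the test split into the accuracy numbers reported after each evaluation pass.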