CGCL-codes/Vulcan

💫 Vulcan · Class-Specific ViT Derivation 💫

🌀 What is Vulcan?

🚀 Vulcan is a novel approach for deriving compact, class-specific Vision Transformers (ViTs) tailored for resource-constrained edge devices. 🎯 Given a pre-trained base ViT, Vulcan derives lightweight ViTs that focus on recognizing only the target classes.
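Vulcan's actual derivation logic lives in `src/method`; purely as a sketch of what class-specific structured pruning can mean, the snippet below keeps only the attention heads with the highest importance scores for the target classes. The function name, scoring scheme, and shapes are illustrative assumptions, not Vulcan's API:

```python
import numpy as np

def prune_heads(qkv_weight, head_importance, keep_ratio):
    """Drop the least important attention heads from a per-head weight matrix.

    qkv_weight: (num_heads * head_dim, in_dim), rows grouped by head.
    head_importance: (num_heads,) hypothetical class-specific scores.
    keep_ratio: fraction of heads to keep.
    """
    num_heads = head_importance.shape[0]
    head_dim = qkv_weight.shape[0] // num_heads
    keep = max(1, int(num_heads * keep_ratio))
    # Indices of the `keep` highest-scoring heads, in ascending order.
    kept = np.sort(np.argsort(head_importance)[::-1][:keep])
    # Gather the weight rows belonging to the surviving heads.
    rows = np.concatenate(
        [np.arange(h * head_dim, (h + 1) * head_dim) for h in kept]
    )
    return qkv_weight[rows], kept
```

For example, with 4 heads of dimension 2 and scores `[0.1, 0.9, 0.5, 0.2]`, a keep ratio of 0.5 retains heads 1 and 2 and halves the weight matrix's rows.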

📂 Project Structure

| Folder/File | Description |
| --- | --- |
| src/data | Stores sub-task definitions and intermediate experimental results. |
| src/dataset | Dataset loading, processing, and augmentation utilities. |
| src/engine | Core training and evaluation pipelines. |
| src/method | Core implementations of Vulcan, including the CCNC and TNNR losses, adaptive configuration, and structured pruning. |
| src/model | Model definitions and loading utilities for ViT and Swin backbones. |
| src/scripts | Shell scripts for running Vulcan experiments with different models, tasks, and pruning configurations. |
| src/utils | General utility functions for profiling, FLOPs/parameter calculation, memory analysis, and training support. |
| src/main.py | Main entry point to run Vulcan, including post-training and pruning. |

🚀 Quick Start

1. Clone the repository

First, clone the Vulcan project to your local machine:

```shell
git clone https://github.com/xxx/vulcan.git
cd vulcan/src/scripts
```

2. Install required dependencies

3. Run the pipeline

```shell
./vulcan_base.sh
```
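The repository does not spell out an install command for step 2; assuming a standard PyTorch stack (the package list below is a guess, not taken from the repo):

```shell
# Hypothetical dependency install; check the repo for an actual
# requirements.txt before relying on this package list.
pip install torch torchvision timm
```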

📎 Citation

If you find this code useful, please cite our paper:

```bibtex
@inproceedings{wei2026vulcan,
  title={Vulcan: Crafting compact class-specific Vision Transformers for edge intelligence},
  author={Wei, Ziteng and He, Qiang and Chen, Feifei and Duan, Ranjie and Li, Xiaodan and Li, Bin and Chen, Yuefeng and Xue, Hui and Jin, Hai and Yang, Yun},
  booktitle={International Conference on Learning Representations},
  year={2026}
}
```

About

Task-specific Model Derivation
