Pinned
- torchdistill — A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Train…
- sc2-benchmark — [TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
- ladon-multi-task-sc2 — [WACV 2025] "A Multi-task Supervised Compression Model for Split Computing"
- supervised-compression — [WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
- hnd-ghnd-object-detectors — [ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challen…
- head-network-distillation — [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural …