
Auto scaling factor tuning on FP8 weight gradients reduction for Megatron-LM #21

Open

wkcn wants to merge 3 commits into base: main
Conversation


@wkcn commented on Dec 8, 2023

Related PR in the MS-AMP repo: Azure/MS-AMP#140

This PR adds the following arguments to the Megatron-LM patch:

+    # Auto scaling factor tuning for FP8 collective communication
+    group.add_argument('--wgrad-auto-scaling', action='store_true', default=False,
+                       help='whether to enable auto scaling factor tuning on weight gradients reduction')
+    group.add_argument('--wgrad-auto-scaling-freq', type=int, default=10,
+                       help='the frequency of checking whether overflow exists in the result of weight gradients reduction')
+    group.add_argument('--wgrad-auto-scaling-ratio', type=float, default=1e-3,
+                       help='the threshold of overflow ratio for auto scaling factor tuning on weight gradients reduction')
+    group.add_argument('--wgrad-auto-scaling-window', type=int, default=100,
+                       help='the window size for auto scaling factor tuning on weight gradients reduction')

@wkcn requested review from tocean and guoshzhao on December 12, 2023