
#sdy Insert sdy.all_reduce ops in -sdy-insert-explicit-reshards pass. #420

Merged
1 commit merged into main from test_737542586 on Mar 18, 2025

Conversation

copybara-service[bot]

#sdy Insert sdy.all_reduce ops in -sdy-insert-explicit-reshards pass.

If an op has a sharded reduction factor (e.g., the contracting dimension of a dot) after reshards are inserted, we should insert a `sdy.all_reduce` with the axes that shard all reduction factors.

In the future, when we have support for unreduced axes in TensorShardingAttr, we can have this pass insert a reshard from unreduced axes to replicated axes, and insert the all-reduce later on.
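
As a rough illustration (the mesh, function, and exact textual assembly below are assumptions for this sketch, not taken from the PR): if the contracting dimension of a dot is sharded on axis "x", each device only holds a partial sum of the result, so after reshards are inserted the pass also inserts a `sdy.all_reduce` over "x" on the result.

```mlir
// Hypothetical mesh; axis "x" shards the contracting dimension of the dot.
sdy.mesh @mesh = <["x"=2, "y"=2]>

func.func @sharded_dot(
    %lhs: tensor<8x16xf32> {sdy.sharding = #sdy.sharding<@mesh, [{}, {"x"}]>},
    %rhs: tensor<16x8xf32> {sdy.sharding = #sdy.sharding<@mesh, [{"x"}, {}]>})
    -> tensor<8x8xf32> {
  // Each device computes a partial dot over its shard of the contracting dim.
  %partial = stablehlo.dot_general %lhs, %rhs, contracting_dims = [1] x [0]
      : (tensor<8x16xf32>, tensor<16x8xf32>) -> tensor<8x8xf32>
  // Inserted by -sdy-insert-explicit-reshards: sum the partial results over
  // the axis that shards the reduction factor (assembly format approximated).
  %result = sdy.all_reduce {"x"} %partial out_sharding=<@mesh, [{}, {}]> : tensor<8x8xf32>
  return %result : tensor<8x8xf32>
}
```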

PiperOrigin-RevId: 737966111
copybara-service[bot] merged commit f27458f into main on Mar 18, 2025
copybara-service[bot] deleted the test_737542586 branch on March 18, 2025 at 12:31