auto parallel add dp demo. #64144
Conversation
Your PR has been submitted successfully. Thank you for your contribution to the open-source project!
@@ -31,7 +31,7 @@ def test_mlp(self):
             {"dtype": "float32", "seed": "2023"}, {"backend": ["gpu"]}
         )
         for envs in envs_list:
-            # self._log_dir.name = "./log"
+            self._log_dir.name = "./log"
Remember to remove this debug override before merging.
done, thanks!
allgather_value = paddle._pir_ops.c_allgather(
    src_value, group.id, num_of_process, False
)
allgather_value.set_type(dst_type)
We should think about a better way to remove this hack, e.g. have communication operations in PIR use a special infermeta_local.
As discussed, I will systematically solve this problem in the next PR.
0e9e9b4 to feddb28
LGTM
PR Category
Auto Parallel
PR Types
Not User Facing
Description
Add a DP precision verification unit test.
Other
Pcard-67164