
Support both use_calc_stream and sync_op in collective communication API #46761

Merged
sljlp merged 9 commits into PaddlePaddle:develop from HermitSun:collective-stream-apis on Oct 11, 2022

Conversation

Contributor

@HermitSun HermitSun commented Oct 6, 2022

PR types

New features

PR changes

APIs

Describe

In the new communication library, we designed ProcessGroup to manage different communication groups. Each process group has its own stream, and all communications in that group are done on this stream. In high-level APIs like distributed.all_reduce, we use use_calc_stream to indicate whether the operation is synchronous. Note that frequently adding unnecessary CUDA events may degrade performance on some models. To achieve high performance, this PR adds a new API named distributed.stream.all_reduce. This new API provides both use_calc_stream and sync_op.

  • sync_op: indicates whether the communication is synchronous.
  • use_calc_stream: performs the communication on the calculation stream, saving the cost of switching streams. Only takes effect when sync_op is true.
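How the two flags interact can be sketched in plain Python (a conceptual model of the dispatch logic described above, not Paddle internals; the function name `dispatch_collective` and the returned labels are illustrative only):

```python
# Conceptual model of how sync_op and use_calc_stream select the execution
# path of a collective call. Names and labels are illustrative, not Paddle code.

def dispatch_collective(sync_op: bool, use_calc_stream: bool) -> str:
    """Return a label describing how the collective would be executed."""
    if use_calc_stream and not sync_op:
        # use_calc_stream only takes effect for synchronous ops.
        raise ValueError("use_calc_stream can only be true when sync_op is true")
    if use_calc_stream:
        # Run directly on the calculation stream: no stream switch and
        # no extra CUDA event needed for synchronization.
        return "sync on calc stream"
    if sync_op:
        # Run on the group's communication stream, then block until done.
        return "sync on comm stream"
    # Asynchronous: return immediately; the caller waits on the returned task.
    return "async on comm stream"

print(dispatch_collective(sync_op=True, use_calc_stream=True))    # sync on calc stream
print(dispatch_collective(sync_op=True, use_calc_stream=False))   # sync on comm stream
print(dispatch_collective(sync_op=False, use_calc_stream=False))  # async on comm stream
```

The invalid combination (sync_op false with use_calc_stream true) is rejected, matching the constraint that use_calc_stream only works when sync_op is true.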

For the corresponding Chinese API documentation changes, see PaddlePaddle/docs#5237.

TODO:

  • reduce
  • broadcast
  • all_reduce
  • all_gather
  • alltoall
  • alltoall_single
  • sendrecv
  • scatter
  • reduce_scatter
  • _reduce_scatter_base

@paddle-bot

paddle-bot Bot commented Oct 6, 2022

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@HermitSun HermitSun force-pushed the collective-stream-apis branch 2 times, most recently from b5cad1a to 447a6e8 Compare October 8, 2022 12:10
LiYuRio
LiYuRio previously approved these changes Oct 9, 2022
@HermitSun HermitSun force-pushed the collective-stream-apis branch from 7069753 to 5a2685a Compare October 10, 2022 03:54
Contributor

@gongweibao gongweibao left a comment


LGTM

Contributor

@XieYunshen XieYunshen left a comment


LGTM
Unit test timeout settings look good.

Contributor

@XiaoguangHu01 XiaoguangHu01 left a comment


LGTM

@sljlp sljlp merged commit f94edc3 into PaddlePaddle:develop Oct 11, 2022
@HermitSun HermitSun deleted the collective-stream-apis branch October 11, 2022 11:17
HermitSun added a commit to HermitSun/Paddle that referenced this pull request Oct 12, 2022
XiaoguangHu01 pushed a commit that referenced this pull request Oct 17, 2022
* Support both use_calc_stream and sync_op in send recv APIs (#46023)

* Support both use_calc_stream and sync_op in allgather API (#46295)

* Support both use_calc_stream and sync_op in collective communication API (#46761)

* Move group and all reduce from collective to communication (#45848)

* Completes bfloat16 dtype for collective api in eager mode (#45844)

* Fix collective APIs cannot be recognized when building docs (#46962)

Co-authored-by: LiYuRio <63526175+LiYuRio@users.noreply.github.com>


7 participants