
[docs] refactoring docstrings in ./src/diffusers/models/transformers/auraflow_transformer_2d.py #9715

Open
wants to merge 1 commit into main

Conversation

@ahnjj (Contributor) commented Oct 19, 2024

What does this PR do?

Fixes #9567

(I reopened this PR with the same doc, since my previous PR may have caused a sync issue. Sorry for the confusion!)

  • Unified two expressions with the same meaning into one (Arguments / Parameters -> Args); see the sketch after this list.
  • Made the summary and argument descriptions more concise.
  • Corrected grammar.
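
To illustrate the first bullet, here is a minimal sketch of the `Args:`-style docstring the PR unifies toward; the function name and parameters below are hypothetical and not taken from the diff:

```python
def scale_shift(hidden_states, timestep):
    """
    Applies a timestep-dependent scale and shift to the hidden states (illustrative only).

    Args:
        hidden_states (`torch.Tensor`):
            Input hidden states.
        timestep (`torch.LongTensor`):
            Current diffusion timestep.
    """
    ...
```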

Before submitting

Who can review?

This PR attempts a solution for one of the submodules listed in #9567, so I think @a-r-r-o-w is the best person to review it. @charchit7, @yijun-lee, and @SubhasmitaSw were also working on the same issue, so this is just a ping for an update.

…auraflow_transformer_2d.py

I reopened this PR since the previous one may have caused a sync issue.
@ahnjj marked this pull request as ready for review on October 19, 2024 05:36
@a-r-r-o-w (Member) left a comment:

Thanks, great work with improving these!

-            processor (`dict` of `AttentionProcessor` or only `AttentionProcessor`):
-                The instantiated processor class or a dictionary of processor classes that will be set as the processor
+        Args:
+            processor (Union[`dict`, `dict[`AttentionProcessor`]`]):

Suggested change:
- processor (Union[`dict`, `dict[`AttentionProcessor`]`]):
+ processor (`AttentionProcessor` or `Dict[str, AttentionProcessor]`]):
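
For reference, with the suggestion applied the docstring would read roughly as below. This is a sketch modeled on the `set_attn_processor` docstring used elsewhere in diffusers; the class stand-in, the method signature, and the trailing sentence about dict keys are assumptions for illustration rather than lines from this diff:

```python
from typing import Dict, Union

from diffusers.models.attention_processor import AttentionProcessor


class AuraFlowTransformer2DModel:  # simplified stand-in for the real class
    def set_attn_processor(self, processor: Union[AttentionProcessor, Dict[str, AttentionProcessor]]):
        r"""
        Sets the attention processor to use to compute attention.

        Args:
            processor (`AttentionProcessor` or `Dict[str, AttentionProcessor]`):
                The instantiated processor class or a dictionary of processor classes that will be set as
                the processor for **all** `Attention` layers. If `processor` is a dict, the key needs to
                define the path to the corresponding cross attention processor.
        """
        ...
```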

@@ -347,8 +347,9 @@ def __init__(
     def attn_processors(self) -> Dict[str, AttentionProcessor]:
         r"""
         Returns:
-            `dict` of attention processors: A dictionary containing all attention processors used in the model with
-            indexed by its weight name.
+            [`dict[`attention processors`]`]

This would cause tests to fail too, as it requires all implementations that copy from UNet2DConditionModel.attn_processors to have the same docs and implementation. This can be changed, but you need to make the modification in the mentioned file and then run `make fix-copies` (can be done in a separate PR).
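
For context, this is roughly the constraint the comment describes: the property is treated as copied from `UNet2DConditionModel.attn_processors`, and `make fix-copies` regenerates it, docstring included, from that source, so the wording change has to land in `unet_2d_condition.py` first and then be propagated. A minimal sketch (body elided, stand-in type, for illustration only):

```python
from typing import Dict

AttentionProcessor = object  # stand-in for diffusers' attention processor type in this sketch


class AuraFlowTransformer2DModel:  # simplified stand-in for the real class
    @property
    # Copied from diffusers.models.unets.unet_2d_condition.UNet2DConditionModel.attn_processors
    def attn_processors(self) -> Dict[str, AttentionProcessor]:
        r"""
        Returns:
            `dict` of attention processors: A dictionary containing all attention processors used in the model
            with indexed by its weight name.
        """
        # The real implementation recursively walks named_children() and collects each
        # submodule's processor keyed by its weight name; elided here.
        return {}
```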

@@ -405,8 +406,10 @@ def fn_recursive_attn_processor(name: str, module: torch.nn.Module, processor):
     # Copied from diffusers.models.unets.unet_2d_condition.UNet2DConditionModel.fuse_qkv_projections with FusedAttnProcessor2_0->FusedAuraFlowAttnProcessor2_0
     def fuse_qkv_projections(self):
         """
-        Enables fused QKV projections. For self-attention modules, all projection matrices (i.e., query, key, value)
-        are fused. For cross-attention modules, key and value projection matrices are fused.
+        Enables fused QKV projections.

I think this will cause tests to break because it is `# Copied from` elsewhere. This would require the implementation and docs to be the same as every other occurrence.
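
As a usage note (not part of this diff), `fuse_qkv_projections` is the user-facing method whose docstring the hunk above shortens. A hedged sketch of how it is typically called, assuming a recent diffusers release with the AuraFlow pipeline and the `fal/AuraFlow` checkpoint:

```python
import torch
from diffusers import AuraFlowPipeline

# Load the AuraFlow pipeline (checkpoint id assumed for illustration).
pipe = AuraFlowPipeline.from_pretrained("fal/AuraFlow", torch_dtype=torch.float16).to("cuda")

# Fuse the query/key/value projection matrices of the transformer's self-attention modules;
# this is the method whose docstring is discussed above.
pipe.transformer.fuse_qkv_projections()

image = pipe(prompt="a photo of an astronaut riding a horse").images[0]
image.save("astronaut.png")
```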

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.


Successfully merging this pull request may close these issues.

[community] Improving docstrings and type hints
3 participants