4 changes: 2 additions & 2 deletions monai/networks/nets/diffusion_model_unet.py
@@ -1527,9 +1527,9 @@ class DiffusionModelUNet(nn.Module):
         upcast_attention: if True, upcast attention operations to full precision.
         dropout_cattn: if different from zero, this will be the dropout value for the cross-attention layers.
         include_fc: whether to include the final linear layer. Default to True.
-        use_combined_linear: whether to use a single linear layer for qkv projection, default to True.
+        use_combined_linear: whether to use a single linear layer for qkv projection, default to False.
         use_flash_attention: if True, use Pytorch's inbuilt flash attention for a memory efficient attention mechanism
-            (see https://pytorch.org/docs/2.2/generated/torch.nn.functional.scaled_dot_product_attention.html).
+            (see https://pytorch.org/docs/2.2/generated/torch.nn.functional.scaled_dot_product_attention.html), default to False.
     """
 
     def __init__(
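For context, a minimal sketch of how the flags documented above might be set when constructing the network. Only the class name, file path, and the parameters named in the diff (include_fc, use_combined_linear, use_flash_attention) come from this PR; the remaining constructor arguments, their values, and the forward-pass shapes are illustrative assumptions, not part of the change.

import torch
from monai.networks.nets import DiffusionModelUNet

# Sketch only: a small 2D diffusion UNet with the attention-related flags
# set to the defaults documented in this docstring update. Channel sizes,
# attention levels, and other arguments below are assumed for illustration.
model = DiffusionModelUNet(
    spatial_dims=2,                     # assumed: 2D variant
    in_channels=1,
    out_channels=1,
    channels=(32, 64, 64),              # assumed channel progression
    attention_levels=(False, True, True),
    num_res_blocks=1,
    num_head_channels=8,
    include_fc=True,                    # documented default: keep the final linear layer
    use_combined_linear=False,          # corrected default: separate q/k/v projections
    use_flash_attention=False,          # corrected default: flash attention off
)

# Forward pass: noisy input plus integer diffusion timesteps.
x = torch.randn(2, 1, 64, 64)
timesteps = torch.randint(0, 1000, (2,))
out = model(x, timesteps)
print(out.shape)  # expected: torch.Size([2, 1, 64, 64])

Setting use_combined_linear=False and use_flash_attention=False explicitly, as above, matches the defaults that this PR's docstring edit documents.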