
[Academic Paper] EDT: An Efficient Diffusion Transformer Framework Inspired by Human-like Sketching (NeurIPS 2024)

Posted on 2025-4-2 14:04:55


Transformer-based Diffusion Probabilistic Models (DPMs) have shown more potential than CNN-based DPMs, yet their extensive computational requirements hinder widespread practical applications. To reduce the computation budget of transformer-based DPMs, this work proposes the Efficient Diffusion Transformer (EDT) framework.

The framework includes a lightweight diffusion model architecture, plus a training-free Attention Modulation Matrix and its alternating arrangement within EDT, both inspired by human-like sketching. Additionally, we propose a token relation-enhanced masking training strategy tailored explicitly for EDT to strengthen its ability to learn relations among tokens.
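To make the idea of a training-free Attention Modulation Matrix concrete, here is a minimal sketch (not the paper's actual implementation): an additive bias matrix, built without any learned parameters, that emphasizes local token interactions in one attention pass, loosely mirroring how a human sketch alternates between local detail and global structure. The window size and the additive-bias formulation are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def local_modulation_matrix(n_tokens, window=2):
    # Hypothetical training-free modulation: an additive logit bias that
    # keeps attention within a local window and suppresses distant pairs.
    idx = np.arange(n_tokens)
    dist = np.abs(idx[:, None] - idx[None, :])
    return np.where(dist <= window, 0.0, -4.0)

def modulated_attention(q, k, v, mod=None):
    # Standard scaled dot-product attention with an optional,
    # parameter-free modulation matrix added to the logits.
    scale = 1.0 / np.sqrt(q.shape[-1])
    logits = (q @ k.T) * scale
    if mod is not None:
        logits = logits + mod
    return softmax(logits) @ v
```

In an alternating arrangement, one could apply such a local-bias matrix in some attention layers and leave others unmodulated (global), so the model interleaves local refinement with global composition.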

Our extensive experiments demonstrate the efficacy of EDT. The EDT framework reduces training and inference costs and surpasses existing transformer-based diffusion models in image synthesis performance, thereby achieving a significant overall enhancement.

While achieving lower FID, EDT-S, EDT-B, and EDT-XL attained speed-ups of 3.93x, 2.84x, and 1.92x respectively during training, and 2.29x, 2.29x, and 2.22x respectively during inference, compared to MDTv2 models of the corresponding sizes. The source code is released at this https URL.

arxiv: https://arxiv.org/abs/2410.23788

