Research Implementation Hub
Contribute to trending research. Read, implement, and discuss.
2017 • Vaswani et al.
Attention Is All You Need
The landmark paper introducing the Transformer architecture, replacing RNNs with self-attention mechanisms.
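A minimal PyTorch sketch of the scaled dot-product self-attention at the core of the Transformer. The single-head simplification, class name, and tensor sizes are illustrative assumptions, not a reference implementation of the paper.

```python
# Minimal single-head scaled dot-product self-attention (illustrative sketch).
import math
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        self.d_model = d_model
        # Learned projections for queries, keys, and values.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_model)
        weights = torch.softmax(scores, dim=-1)
        return weights @ v

x = torch.randn(2, 5, 64)       # (batch, seq_len, d_model)
out = SelfAttention(64)(x)      # output has the same shape as the input
print(out.shape)                # torch.Size([2, 5, 64])
```

The full model stacks multi-head versions of this block with feed-forward layers and positional encodings; this sketch only shows the attention mechanism itself.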
2020 • Ho et al.
Denoising Diffusion Probabilistic Models
High-quality image synthesis using diffusion probabilistic models.
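A short sketch of the closed-form forward (noising) process q(x_t | x_0) used in DDPMs: clean data is mixed with Gaussian noise according to a cumulative noise schedule, and a network is trained to predict that noise. The linear beta schedule, timestep count, and `q_sample` helper are assumptions for illustration.

```python
# Closed-form forward (noising) step of a DDPM (illustrative sketch).
import torch

T = 1000
# Linear beta schedule; alpha_bar_t is the cumulative product of (1 - beta_t).
betas = torch.linspace(1e-4, 0.02, T)
alpha_bars = torch.cumprod(1.0 - betas, dim=0)

def q_sample(x0: torch.Tensor, t: torch.Tensor, noise: torch.Tensor) -> torch.Tensor:
    """x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise."""
    ab = alpha_bars[t].view(-1, 1, 1, 1)   # broadcast over channel/spatial dims
    return ab.sqrt() * x0 + (1.0 - ab).sqrt() * noise

x0 = torch.randn(4, 3, 32, 32)             # a batch of "images"
t = torch.randint(0, T, (4,))              # one random timestep per sample
noise = torch.randn_like(x0)
xt = q_sample(x0, t, noise)                # noised inputs the denoiser learns to reverse
```

Sampling runs the learned reverse process step by step from pure noise back to an image; the noising step above is what supplies the training targets.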
2021 • Hu et al.
LoRA: Low-Rank Adaptation of Large Language Models
Efficient fine-tuning of LLMs by injecting trainable rank decomposition matrices.
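A minimal sketch of the LoRA idea: keep a pretrained weight matrix frozen and add a trainable low-rank update B·A scaled by alpha/r. The `LoRALinear` wrapper, rank, and initialization constants here are illustrative choices, not the paper's exact configuration.

```python
# LoRA-style wrapper around a frozen linear layer (illustrative sketch).
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():   # freeze the pretrained weights
            p.requires_grad = False
        # Trainable rank decomposition: effective weight is W + (alpha / r) * B @ A
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(768, 768), r=8)
out = layer(torch.randn(2, 10, 768))       # only A and B receive gradients
```

Because B is initialized to zero, the adapted layer starts out identical to the frozen model, and only the small A and B matrices are updated during fine-tuning.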