Attention Dropout is a type of dropout used in attention-based architectures, where elements of the softmax output (the attention weights) are randomly dropped. For example, in scaled dot-product attention, dropout is applied to the first term before it is multiplied by $V$:
$$ {\text{Attention}}(Q, K, V) = \text{softmax}\left(\frac{QK^{T}}{\sqrt{d_k}}\right)V $$
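As a concrete illustration, here is a minimal PyTorch sketch of scaled dot-product attention with dropout applied to the attention weights. The function name, tensor layout, and the `p_drop` rate are illustrative choices, not a specific paper's implementation:

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, p_drop=0.1, training=True):
    # q, k, v: (batch, heads, seq_len, d_k)
    d_k = q.size(-1)
    # Scaled dot-product scores: QK^T / sqrt(d_k).
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    # Softmax over the key dimension gives the attention weights.
    attn = F.softmax(scores, dim=-1)
    # Attention Dropout: randomly zero entries of the attention weights.
    # Note that dropout rescales surviving weights by 1/(1 - p_drop),
    # so rows of `attn` no longer sum exactly to 1 during training.
    attn = F.dropout(attn, p=p_drop, training=training)
    return attn @ v
```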
| Task | Papers | Share |
|---|---|---|
| Retrieval | 78 | 9.50% |
| Language Modelling | 68 | 8.28% |
| Question Answering | 48 | 5.85% |
| Large Language Model | 39 | 4.75% |
| Sentence | 27 | 3.29% |
| In-Context Learning | 22 | 2.68% |
| Text Generation | 22 | 2.68% |
| Information Retrieval | 17 | 2.07% |
| Prompt Engineering | 16 | 1.95% |