Learning Rate Schedules

Linear Warmup With Linear Decay

Linear Warmup With Linear Decay is a learning rate schedule that increases the learning rate linearly from zero to a peak value over the first $n$ updates, then decays it linearly (typically to zero) over the remaining updates.
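A minimal sketch of the schedule as a step-indexed function; the function name and parameters (`warmup_steps`, `total_steps`, a peak of `base_lr`, decay to zero) are illustrative assumptions, not from any particular library:

```python
def linear_warmup_linear_decay(step, base_lr, warmup_steps, total_steps):
    """Return the learning rate at a given update step.

    Warmup: rises linearly from ~0 to base_lr over the first warmup_steps.
    Decay: falls linearly from base_lr to 0 by total_steps (an assumed
    end value; some setups decay to a small floor instead).
    """
    if step < warmup_steps:
        # linear warmup: fraction of warmup completed
        return base_lr * (step + 1) / warmup_steps
    # linear decay: fraction of the post-warmup budget consumed
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, 1.0 - progress)
```

For example, with `base_lr=0.1`, `warmup_steps=10`, and `total_steps=100`, the rate climbs to 0.1 by step 9, sits at 0.1 at step 10, and reaches 0.0 at step 100. Deep learning frameworks typically expose this pattern through a scheduler callback rather than a bare function.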


Tasks


Task                     Papers   Share
Retrieval                   124   13.25%
Language Modelling           96   10.26%
Question Answering           63    6.73%
Large Language Model         39    4.17%
Sentence                     29    3.10%
Text Classification          28    2.99%
Sentiment Analysis           26    2.78%
Information Retrieval        24    2.56%
Text Generation              22    2.35%

