Eureka-Moments in Transformers: Multi-Step Tasks Reveal Softmax Induced Optimization Problems

International Conference on Machine Learning (ICML), 2024
Abstract: In this work, we study rapid improvements of the training loss in transformers when confronted with multi-step decision tasks. We find that transformers struggle to learn the intermediate task, and both training and validation loss saturate for hundreds of epochs. When transformers finally learn the intermediate task, they do so rapidly and unexpectedly. We call these abrupt improvements Eureka-moments, since the transformer appears to suddenly learn a previously incomprehensible concept. We designed synthetic tasks to study the problem in detail, but the leaps in performance can also be observed for language modeling and in-context learning (ICL). We suspect that these abrupt transitions are caused by the multi-step nature of these tasks. Indeed, we find connections and show that ways to improve on the synthetic multi-step tasks can be used to improve the training of language modeling and ICL. Using the synthetic data, we trace the problem back to the Softmax function in the self-attention block of transformers and show ways to alleviate the problem. These fixes reduce the number of required training steps, lead to a higher likelihood of learning the intermediate task and a higher final accuracy, and make training more robust to hyper-parameters.
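The abstract attributes the training plateau to the Softmax in the transformer's self-attention block. The sketch below (not the paper's implementation; all tensor names and the artificial logit offset are illustrative) shows the underlying gradient issue: once the attention logits of the token carrying the intermediate cue are dominated by others, the softmax assigns it almost no mass, and the gradient that would let the model attend to it all but vanishes. A flatter softmax is included as one generic way such saturation can ease; the paper's actual fixes are described in the full text.

# Minimal PyTorch sketch of softmax saturation in self-attention.
import torch

torch.manual_seed(0)

d = 16                                      # head dimension (arbitrary)
q = torch.randn(1, d)                       # a single query
k = torch.randn(4, d, requires_grad=True)   # four keys; key 0 carries the cue

logits = q @ k.T / d ** 0.5                 # scaled dot-product logits
logits = logits + torch.tensor([[-8.0, 8.0, 0.0, 0.0]])  # force saturation

attn = torch.softmax(logits, dim=-1)
attn[0, 0].backward(retain_graph=True)      # gradient of the cue's attention
print(attn.detach())                        # mass concentrates on key 1
print(k.grad.norm())                        # near-zero: the cue is hard to learn

# A flatter softmax (higher temperature) leaves a usable gradient: one
# generic way to ease saturation (illustrative, not the paper's specific fix).
k.grad = None
attn_flat = torch.softmax(logits / 4.0, dim=-1)
attn_flat[0, 0].backward()
print(k.grad.norm())                        # orders of magnitude larger
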
BibTeX reference

@InProceedings{HSBB24,
  author       = "D. T. Hoffmann and S. Schrodi and J. Bratuli\'{c} and N. Behrmann and V. Fischer and T. Brox",
  title        = "Eureka-Moments in Transformers: Multi-Step Tasks Reveal Softmax Induced Optimization Problems",
  booktitle    = "International Conference on Machine Learning (ICML)",
  year         = "2024",
  note         = "https://arxiv.org/pdf/2310.12956",
  key          = "hoffmann_eureka",
  url          = "http://lmb.informatik.uni-freiburg.de/Publications/2024/HSBB24"
}
