‘Catastrophic overtraining’ could harm large language AI models that are trained on ever more data simply for the sake of it
Researchers from top US universities warn that extending pre-training can be detrimental to performance
Too much pre-training can deliver worse performance…