Preprint, Working Paper, Computer Science, Distributed, Parallel, and Cluster Computing
Optimizing Distributed Training on Frontier for Large Language Models
Sajal Dash, Isaac Lyngaas, Junqi Yin, Xiao Wang, Romain Egele, et al. Optimizing Distributed Training on Frontier for Large Language Models. 2024. ⟨hal-04393799⟩