OpenGPT-X - Training Large Language Models on HPC Systems
Poster (After Call) | FZJ-2022-03599
2022
Please use a persistent id in citations: http://hdl.handle.net/2128/32006
Abstract: Artificial neural networks represent an HPC workload of increasing importance. In particular, the field of Natural Language Processing (NLP) has undergone a revolution in recent years. Training ever larger language models, such as GPT-3, demands substantial HPC resources and has the potential to greatly impact everyday technology. The OpenGPT-X project was established in 2022 and aims not to leave this field to large tech companies but to provide an open, publicly funded alternative based on European values. The Jülich Supercomputing Centre is a consortium partner providing HPC infrastructure for the pre-training of the models. We investigate optimization potential in the training process, for example by using novel accelerator architectures.