Conference Presentation (Invited) FZJ-2023-04874

Novel Architecture Exploration - OpenGPT-X: Open Large Language Models


2023

WHPC@SC23: 16th International Women in HPC Workshop, Denver, Colorado, USA, 12 Nov 2023 - 17 Nov 2023 [DOI: 10.34734/FZJ-2023-04874]


Please use a persistent id in citations: doi:10.34734/FZJ-2023-04874

Abstract: The OpenGPT-X project is a German initiative with ten collaborators to build, train, and deploy a multilingual open-source language model. Models trained within the project will be used for pilot cases by industry partners and commercialized through the Gaia-X Federation. Because efficiently training large language models requires substantial memory and compute resources, high-performance computing systems such as JUWELS Booster are essential. This work presents the results of the novel hardware architecture exploration conducted within the scope of the project.


Contributing Institute(s):
  1. Jülich Supercomputing Center (JSC)
Research Program(s):
  1. 5112 - Cross-Domain Algorithms, Tools, Methods Labs (ATMLs) and Research Groups (POF4-511)
  2. 5121 - Supercomputing & Big Data Facilities (POF4-512)
  3. OpenGPT-X - Building a Gaia-X node for large AI language models and innovative language application services; subproject: optimization and scaling on large HPC systems (68GX21007F)
  4. ATML-X-DEV - ATML Accelerating Devices (ATML-X-DEV)

Appears in the scientific report 2023
Database coverage:
OpenAccess

The record appears in these collections:
Document types > Presentations > Conference Presentations
Workflow collections > Public records
Institute Collections > JSC
Publications database
Open Access

Record created 2023-11-27, last modified 2025-08-22

