Software FZJ-2025-02415

Scalasca Trace Tools: Toolset for scalable performance analysis of large-scale parallel applications (v2.6.2)


2025

Please use a persistent id in citations: doi:

Abstract: Scalasca is a software tool that supports the performance optimization of parallel programs by measuring and analyzing their runtime behaviour. The analysis identifies potential performance bottlenecks – in particular those concerning communication and synchronization – and offers guidance in exploring their causes. Scalasca supports the performance optimization of simulation codes on a wide range of current HPC platforms. Its powerful analysis and intuitive result presentation guide the developer through the tuning process. Scalasca mainly targets scientific and engineering applications based on the MPI and OpenMP programming interfaces, including hybrid applications that combine the two. The tool has been specifically designed for use on large-scale systems, but is also well suited for small- and medium-scale HPC platforms. The software is available for free download under the New BSD open-source license.
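The measure-analyze-examine workflow described in the abstract is typically driven through the `scalasca` convenience command, with Score-P providing the measurement infrastructure. A minimal sketch of such a session (the source file name, compiler wrapper, and process count are placeholders, and the experiment directory name assumes Scalasca's default naming convention):

```shell
# Build the MPI application with Score-P instrumentation
# (Scalasca 2.x uses Score-P as its measurement system)
scorep mpicc -O2 -o app app.c

# Run the instrumented application under measurement control;
# collects a runtime summary and/or event trace for analysis
scalasca -analyze mpirun -np 16 ./app

# Interactively explore the resulting analysis report with Cube
scalasca -examine scorep_app_16_sum
```

Adding the `-t` option to `scalasca -analyze` additionally records an event trace and triggers the automatic trace analysis that locates wait states in communication and synchronization.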

Keyword(s): performance analysis ; Score-P ; Cube ; HPC ; MPI ; OpenMP ; trace tools


Contributing Institute(s):
  1. Jülich Supercomputing Centre (JSC)
Research Program(s):
  1. 5112 - Cross-Domain Algorithms, Tools, Methods Labs (ATMLs) and Research Groups (POF4-511) (POF4-511)
  2. ATMLPP - ATML Parallel Performance (ATMLPP) (ATMLPP)
  3. ATMLAO - ATML Application Optimization and User Service Tools (ATMLAO) (ATMLAO)

Appears in the scientific report 2025

The record appears in these collections:
Document types > Other Resources > Software
Workflow collections > Public records
Institute Collections > JSC
Publications database

 Record created 2025-05-05, last modified 2025-05-06


