
Scientific Report 2003

Main area of research: Information

R&D project: I03 Scientific Computing

Participating institutes: ZAM

In charge: Dr. Dr. T. Lippert, ZAM, th.lippert@fz-juelich.de



HGF - Research Field / Programme / Topic(s)
4 Key Technologies
4.1 Scientific Computing
4.1.1 National Supercomputing Centre
John von Neumann Institute for Computing (NIC)
4.1.2 Computational Science, Algorithms and Architectures
4.1.3 Grid Computing

Aims and Objectives

Scientific computing has evolved into a third mode of scientific research, alongside theory and experiment, and hence into a strategic key technology. The Research Centre Jülich uses this technology intensively in its own research and offers it to the German scientific community through the John von Neumann Institute for Computing.

The objective of this R&D project is the continued development of the Jülich supercomputing centre, with the aim of maintaining and strengthening its leading role as one of the three German national high-performance computing centres. This task is carried out by the Central Institute for Applied Mathematics (ZAM). It includes the operation, enhancement, and optimisation of the supercomputing systems, the development of the methodology of their use in the computational sciences, and the integration of computers and data into Grid systems. These activities are complemented by educational offerings in high-performance scientific computing.


Significant Results in 2003

National Supercomputing Centre
John von Neumann Institute for Computing (NIC)

Work focused on preparing for the installation of the new supercomputer, which was delivered at the end of 2003. Right on schedule, a new machine room was completed, providing a floor space of 1000 m2 and the facilities for operating large air-cooled systems with an initial power consumption of 500 kVA.

The new supercomputer, an IBM eServer Cluster 1600, consists of 41 p690 nodes, each with 32 POWER4+ processors (1.7 GHz) and 128 GBytes of shared memory. The nodes are interconnected by the High Performance Switch (HPS). The system has a peak performance of about 9 TeraFLOPS and is currently the most powerful computer in Europe. Applications can utilise more than 1000 processors and 5 TeraBytes of memory. A parallel file system provides access to more than 60 TeraBytes of disk space and to an integrated automatic tape store with a capacity of one PetaByte.
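
As a back-of-the-envelope check, the quoted peak follows directly from the node count, assuming four floating-point operations per cycle per POWER4+ processor (two fused multiply-add units), which is an assumption of this sketch rather than a figure from the report:

    # Back-of-the-envelope peak-performance check (assumption: each POWER4+
    # processor completes 4 floating-point operations per cycle, via two
    # fused multiply-add units).
    nodes = 41
    cpus_per_node = 32
    clock_hz = 1.7e9
    flops_per_cycle = 4

    peak = nodes * cpus_per_node * clock_hz * flops_per_cycle
    print(f"Theoretical peak: {peak / 1e12:.1f} TeraFLOPS")  # -> 8.9 TeraFLOPS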

To prepare the setup of the production environment, the new HPS and the associated management software were tested on two nodes. In mid-2003 a system consisting of six p690 nodes was installed and offered to users. With its peak performance of one TeraFLOPS, it replaced the older of ZAM's two CRAY T3E systems.

Further development activities included the automatic control of the cluster system, especially in the case of single-component failures, the installation of a global parallel file system, the performance optimisation of the system and its applications, and the migration of data from the Cray systems.

To enable cooperating, communication-intensive applications, a high-capacity dark-fibre communication link between FZJ, RWTH Aachen, and FH Aachen/Jülich was planned and established. The increased performance and flexibility of the connection to the DFN node in Aachen will allow participation in national and European research projects in networking and Grid computing.

Computational Science, Algorithms and Architectures

Quantum chemistry: The TURBOMOLE program package was enhanced by parallelising the RIDFT module (resolution-of-identity density functional theory), including the multipole approximation. It now allows near real-time DFT calculations of molecules at state-of-the-art system sizes of up to 1000 atoms.
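
The structure of an RI Coulomb build of this kind can be illustrated with a toy numpy sketch. The three-index integrals, metric, and density matrix below are random stand-ins rather than real quantum-chemical quantities, and the multipole part of RIDFT is not shown:

    import numpy as np

    rng = np.random.default_rng(0)
    n, naux = 20, 60   # orbital / auxiliary basis sizes (toy values)

    # Random stand-ins for the three-index integrals (mu nu|P) and a
    # positive-definite auxiliary metric standing in for (P|Q).
    B = rng.standard_normal((n, n, naux))
    B = 0.5 * (B + B.transpose(1, 0, 2))          # symmetric in mu, nu
    V = B.reshape(-1, naux).T @ B.reshape(-1, naux) / n**2 + np.eye(naux)
    D = rng.standard_normal((n, n))
    D = 0.5 * (D + D.T)                           # toy density matrix

    # RI Coulomb build: J[mu,nu] ~ sum_P (mu nu|P) c_P with c = V^(-1) b.
    # Only n*n*naux three-index quantities enter; the O(n^4) four-index
    # integrals are never formed, which is the source of RIDFT's speed-up.
    b = np.einsum('lsP,ls->P', B, D)
    c = np.linalg.solve(V, b)
    J = np.einsum('mnP,P->mn', B, c)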

Structural mechanics: At the end of a three-year cooperation with SERC, India, and INTES GmbH, Stuttgart, the finite-element program FINEART was extended from 2D to large 3D models, improving both its functionality and its efficiency.

Coulomb interactions: The Continuous Fast Multipole Method for calculating the Coulomb interaction between smooth charge distributions was integrated into the quantum chemistry program package TURBOMOLE, and a multigrid scheme was developed for the treatment of long-range interactions in many-particle systems (a direct-sum baseline is sketched below).

The stability of thymine doped with highly positively charged iodine was studied using semi-empirical methods. The calculations indicate a Coulomb explosion for a positive charge of more than 5 elementary charges.

A study of proton acceleration from laser-irradiated foil and wire targets was carried out using the ZAM parallel tree code PEPC.
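
For orientation, the quantity that multipole methods accelerate is the plain pairwise Coulomb sum, which costs O(N^2) when evaluated directly. The sketch below uses point charges in toy units; the CFMM itself targets smooth charge distributions and brings the cost close to O(N) by grouping distant particles into multipole expansions:

    import numpy as np

    rng = np.random.default_rng(1)
    N = 500
    pos = rng.standard_normal((N, 3))         # particle positions (toy units)
    q = rng.choice([-1.0, 1.0], size=N)       # unit point charges

    # Direct evaluation of E = sum_{i<j} q_i q_j / |r_i - r_j|: all N*(N-1)/2
    # pairs are visited, hence O(N^2).
    diff = pos[:, None, :] - pos[None, :, :]
    r = np.linalg.norm(diff, axis=-1)
    i, j = np.triu_indices(N, k=1)
    energy = np.sum(q[i] * q[j] / r[i, j])
    print(f"Direct-sum Coulomb energy: {energy:.3f}")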

Performance analysis: Support for the analysis of hardware-counter data was added to the automatic performance analysis environment KOJAK. In collaboration with IBM, a performance tool for OpenMP applications was designed and implemented, based on binary instrumentation and the POMP monitoring interface.
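
The enter/exit monitoring pattern underlying such tools can be sketched in a few lines of Python. This is only an analogy to the POMP interface, which is a C-level API fired around instrumented OpenMP constructs, not its actual signature:

    import time
    from contextlib import contextmanager

    events = []   # trace of (region, event, timestamp) records

    @contextmanager
    def monitored_region(name):
        # Fires enter/exit events around a region of interest, the pattern
        # a POMP-style monitor applies to instrumented parallel regions.
        events.append((name, "enter", time.perf_counter()))
        try:
            yield
        finally:
            events.append((name, "exit", time.perf_counter()))

    with monitored_region("parallel_loop"):
        sum(i * i for i in range(100_000))    # stand-in for real work

    for name, kind, t in events:
        print(f"{t:.6f}  {kind:5s}  {name}")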

Virtual-reality techniques: A visualisation of magnetohydrodynamics data was implemented in cooperation with Moscow State University. Together with RWTH Aachen, shared scene graphs were realised. The ZAM library VISIT enabled the steering of astrophysical simulations.

Data mining: Within the GALA project with Grünenthal GmbH, different algorithms for binary classification by support vector machines, a machine-learning method, were implemented and applied to pharmaceutical research data.
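
A minimal example of SVM-based binary classification, here with scikit-learn and synthetic features standing in for the proprietary pharmaceutical data:

    # SVM binary classification; synthetic data replaces the GALA inputs.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=400, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = SVC(kernel="rbf", C=1.0)    # RBF-kernel SVM, a common default
    clf.fit(X_train, y_train)
    print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")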

Grid Computing

FZJ has gained world-wide visibility in the Grid computing community through the UNICORE software, developed under the leadership of ZAM, and is a sought-after partner in international Grid projects. ZAM participates in seven positively evaluated proposals in the 6th EU Framework Programme.

Within the EUROGRID project, an alternative file-transfer mechanism for UNICORE was devised and implemented using GridFTP, resulting in a substantial speed-up.
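
Much of GridFTP's speed advantage comes from moving data over several parallel streams. The following toy sketch illustrates that idea with threads and byte ranges on local files; it is a conceptual illustration under that assumption, not the UNICORE/GridFTP implementation:

    import os
    from concurrent.futures import ThreadPoolExecutor

    def copy_range(src_fd, dst_fd, offset, length):
        # One "stream": transfers a single byte range (POSIX I/O).
        os.pwrite(dst_fd, os.pread(src_fd, length, offset), offset)

    def parallel_copy(src, dst, streams=4):
        size = os.path.getsize(src)
        chunk = max(1, (size + streams - 1) // streams)
        src_fd = os.open(src, os.O_RDONLY)
        dst_fd = os.open(dst, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
        try:
            os.ftruncate(dst_fd, size)
            with ThreadPoolExecutor(max_workers=streams) as pool:
                for off in range(0, size, chunk):
                    pool.submit(copy_range, src_fd, dst_fd, off,
                                min(chunk, size - off))
        finally:
            os.close(src_fd)
            os.close(dst_fd)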

The visualisation software VISIT was coupled with UNICORE and demonstrated successfully in combination with Access Grid technology.

Project GRIP enhanced UNICORE for the Open Grid Services Architecture and integrated Grid services based on the new OGSI standard; ZAM led and contributed to the related standards work.

Project OpenMolGRID developed extensions to UNICORE to support the molecular design process. In particular, access to databases of chemical and physical properties of molecules was implemented, and support for complex workflows was integrated.
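
Conceptually, such a workflow is a dependency graph of processing steps executed in topological order. A minimal sketch follows; the step names are purely illustrative, not OpenMolGRID components:

    from graphlib import TopologicalSorter   # Python 3.9+

    # Each step lists its prerequisites (illustrative names only).
    workflow = {
        "database_query":         set(),
        "descriptor_calculation": {"database_query"},
        "model_building":         {"descriptor_calculation"},
        "property_prediction":    {"model_building"},
    }

    for step in TopologicalSorter(workflow).static_order():
        print(f"running step: {step}")   # a real engine would submit a Grid job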

Project PAB developed concepts and tools to manage network domains based on MPLS (Multiprotocol Label Switching). In addition, MPLS-related signalling techniques were analysed in detail.

