001     56461
005     20180211173333.0
024 7 _ |2 DOI
|a 10.1002/cpe.1124
024 7 _ |2 WOS
|a WOS:000248578200003
037 _ _ |a PreJuSER-56461
041 _ _ |a eng
082 _ _ |a 004
084 _ _ |2 WoS
|a Computer Science, Software Engineering
084 _ _ |2 WoS
|a Computer Science, Theory & Methods
100 1 _ |a Gerndt, M.
|b 0
|0 P:(DE-HGF)0
245 _ _ |a A test suite for parallel performance analysis tools
260 _ _ |a Chichester
|b Wiley
|c 2007
300 _ _ |a 1465 - 1480
336 7 _ |a Journal Article
|0 PUB:(DE-HGF)16
|2 PUB:(DE-HGF)
336 7 _ |a Output Types/Journal article
|2 DataCite
336 7 _ |a Journal Article
|0 0
|2 EndNote
336 7 _ |a ARTICLE
|2 BibTeX
336 7 _ |a JOURNAL_ARTICLE
|2 ORCID
336 7 _ |a article
|2 DRIVER
440 _ 0 |a Concurrency and Computation: Practice and Experience
|x 1532-0626
|0 17301
|y 11
|v 19
500 _ _ |a Record converted from VDB: 12.11.2012
520 _ _ |a Parallel performance analysis tools must be tested as to whether they perform their task correctly, which comprises at least three aspects. First, it must be ensured that the tools neither alter the semantics nor distort the run-time behavior of the application under investigation. Next, it must be verified that the tools collect the correct performance data as required by their specification. Finally, it must be checked that the tools perform their intended tasks and detect relevant performance problems. Focusing on the latter (correctness) aspect, testing can be done using synthetic test functions with controllable performance properties, possibly complemented by real-world applications with known performance behavior. A systematic test suite can be built from synthetic test functions and other components, possibly with the help of tools to assist the user in putting the pieces together into executable test programs. Clearly, such a test suite can be highly useful to builders of performance analysis tools. It is surprising that, up until now, no systematic effort has been undertaken to provide such a suite. In this paper we describe the APART Test Suite (ATS) for checking the correctness (in the above sense) of parallel performance analysis tools. In particular, we describe a collection of synthetic test functions which allows one to easily construct both simple and more complex test programs with desired performance properties. We briefly report on experience with MPI and OpenMP performance tools when applied to the test cases generated by ATS. Copyright (c) 2006 John Wiley & Sons, Ltd.
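520 _ _ |z The abstract above describes synthetic test functions with controllable performance properties. The following is a minimal illustrative sketch of such a function, assuming a hypothetical MPI "late sender" pattern; the function and parameter names are invented for illustration and are not the actual ATS API.

     /* Sketch only: a synthetic "late sender" test function with a
      * controllable severity, in the spirit of the test suite described
      * in the abstract. Names are hypothetical, not the ATS API. */
     #include <mpi.h>
     #include <unistd.h>

     /* Rank 0 delays its send by delay_sec, so rank 1 measurably waits
      * in MPI_Recv; a correct tool should attribute roughly delay_sec
      * of waiting time to this call site. */
     static void synthetic_late_sender(double delay_sec)
     {
         int rank, size, buf = 42;
         MPI_Comm_rank(MPI_COMM_WORLD, &rank);
         MPI_Comm_size(MPI_COMM_WORLD, &size);
         if (size < 2)
             return;                               /* needs at least 2 ranks */
         if (rank == 0) {
             usleep((useconds_t)(delay_sec * 1e6)); /* controllable severity */
             MPI_Send(&buf, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
         } else if (rank == 1) {
             MPI_Recv(&buf, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                      MPI_STATUS_IGNORE);
         }
     }

     int main(int argc, char **argv)
     {
         MPI_Init(&argc, &argv);
         synthetic_late_sender(0.5);  /* expect ~0.5 s late-sender wait */
         MPI_Finalize();
         return 0;
     }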
536 _ _ |a Scientific Computing
|c P41
|2 G:(DE-HGF)
|0 G:(DE-Juel1)FUEK411
|x 0
588 _ _ |a Dataset connected to Web of Science
650 _ 7 |a J
|2 WoSType
653 2 0 |2 Author
|a parallel performance analysis
653 2 0 |2 Author
|a automatic performance analysis
653 2 0 |2 Author
|a performance analysis tools
653 2 0 |2 Author
|a parallel applications
653 2 0 |2 Author
|a parallel programming
700 1 _ |a Mohr, B.
|b 1
|u FZJ
|0 P:(DE-Juel1)132199
700 1 _ |a Träff, J. L.
|b 2
|0 P:(DE-HGF)0
773 _ _ |a 10.1002/cpe.1124
|g Vol. 19, p. 1465 - 1480
|p 1465 - 1480
|q 19<1465 - 1480
|0 PERI:(DE-600)2052606-4
|t Concurrency and computation
|v 19
|y 2007
|x 1532-0626
856 7 _ |u http://dx.doi.org/10.1002/cpe.1124
909 C O |o oai:juser.fz-juelich.de:56461
|p VDB
913 1 _ |k P41
|v Scientific Computing
|l Supercomputing
|b Schlüsseltechnologien
|0 G:(DE-Juel1)FUEK411
|x 0
914 1 _ |y 2007
915 _ _ |0 StatID:(DE-HGF)0010
|a JCR/ISI refereed
920 1 _ |k ZAM
|l Zentralinstitut für Angewandte Mathematik
|d 31.12.2007
|g ZAM
|0 I:(DE-Juel1)VDB62
|x 0
970 _ _ |a VDB:(DE-Juel1)88601
980 _ _ |a VDB
980 _ _ |a ConvertedRecord
980 _ _ |a journal
980 _ _ |a I:(DE-Juel1)JSC-20090406
980 _ _ |a UNRESTRICTED
981 _ _ |a I:(DE-Juel1)JSC-20090406

