Based on this analysis, we are augmenting these tools to suit those usage patterns better while retaining their ease of use. Our new toolset can be deployed transparently, without requiring any workflow or instrumentation changes on the user's side, since it merely provides additional functionality. It should therefore find ready acceptance among a large group of users.
In this work, we apply this strategy to the profiling of parallel applications with gprof, a command-line profiler that is installed on almost all systems. While it is limited in scope, it has found wide acceptance among our code teams because of its simplicity, wide installation base, and support by almost every compiler. However, gprof (like most other profiling tools) does not provide any direct support for the most common analysis step in profiling: the comparison of executions, e.g., before and after coding changes intended to improve performance. Instead, the user is left to compare large text logs of profiles manually, which is both tedious and error-prone.
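To illustrate the manual process this addresses, the following minimal Python sketch (an illustration only, not part of gprof or of the toolset described here) reads the flat-profile table of a gprof text log into a mapping from function name to self time. The assumed column layout is the one gprof typically prints, and the helper name parse_flat_profile is hypothetical.

    def parse_flat_profile(path):
        """Read a gprof text log and return a {function name: self seconds}
        mapping from the flat-profile table.  Assumes the usual column
        layout: %time, cumulative seconds, self seconds, calls, ..., name."""
        profile = {}
        in_table = False
        with open(path) as f:
            for line in f:
                fields = line.split()
                if not in_table:
                    # The table body follows the header line that ends in "name".
                    if len(fields) >= 3 and fields[0] == "time" and fields[-1] == "name":
                        in_table = True
                    continue
                if not fields or not fields[0][0].isdigit():
                    break  # a blank line or the next section ends the table
                # Leading fields are numeric; the remainder is the function name.
                numeric = 0
                while numeric < len(fields) and fields[numeric][0].isdigit():
                    numeric += 1
                profile[" ".join(fields[numeric:])] = float(fields[2])  # self seconds
        return profile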
We therefore extend the gprof toolset with differential profiling, allowing the user to directly compare execution profiles as well as callgraphs from two different application executions. In addition, we provide a graphical representation of both individual and differential callgraphs to visualize the often complex information encoded in gprof's callgraph results.
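Conceptually, a differential comparison of two runs amounts to subtracting per-function times and ranking the largest changes, as in the following sketch; it builds on the hypothetical parse_flat_profile helper above, and the function name diff_profiles and the log file names are likewise illustrative rather than the actual implementation of our toolset.

    def diff_profiles(before, after, top=10):
        """Return the functions with the largest change in self time between
        two profiles parsed with parse_flat_profile above."""
        names = set(before) | set(after)
        deltas = {n: after.get(n, 0.0) - before.get(n, 0.0) for n in names}
        return sorted(deltas.items(), key=lambda kv: abs(kv[1]), reverse=True)[:top]

    # Hypothetical gprof text logs from two runs of the same application.
    base = parse_flat_profile("profile-before.txt")
    opt  = parse_flat_profile("profile-after.txt")
    for name, delta in diff_profiles(base, opt):
        print("%+8.2f s  %s" % (delta, name))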
Combined, this provides an easy way to study the impact of code optimizations and parameter changes, as well as code properties, both within one rank and across ranks. We will demonstrate this using four case studies covering various scenarios for single- and multi-node performance analysis. In all cases, our extensions concisely present the key differences between individual executions in a few lines, without the need for long manual searches.
2 Related Work
Only a few tools support differential or comparative performance analysis. One of the exceptions is OpenSpeedShop, a recently developed performance toolset for Linux clusters, which includes the ability to align and contrast results from multiple runs. Further, Karavanic has investigated difference operators for performance event maps in Paradyn as part of her Ph.D. thesis.
Both PerfDMF and PerfTrack provide a base infrastructure capable of supporting differential performance analysis. Both deploy relational databases to store the results of performance analysis across multiple runs. This data can later be queried and compared using external tools.
Most other tools, however, can only work with data gathered during a single run and leave the user with the task of manually contrasting the individual results. Due to the complexity and size of performance data, in particular from large-scale parallel applications, this often tedious task risks missing