
Heap Dump Snapshots

Posted by kellyohair on September 30, 2005 at 12:41 PM PDT

Alan recently blogged about "heap dumps are back", and Sundar blogged about OQL.
These all use the old and historic HPROF binary heap dump format as the
"heap dump snapshot", which is essentially a complete dump of all the objects
in the Java heap at a given time.

    If you are familiar with core dumps and the Solaris utility gcore,
    you'll understand what we are trying to do here, at least in a basic way. This is not a new concept by any means.
    Creating "snapshots in time" of the heap lets you
    browse the data without disturbing a running process, or
    do some postmortem processing on a crashed one.

We also added Bill Foote's valuable but historic
Heap Analysis Tool (HAT)
to view, inspect, and browse these heap dumps.
Binary dumps aren't much good without some kind of viewer.
HAT has been released in the latest Mustang JDK as jhat.
So if you have the latest Mustang and can use HPROF,
you can very easily
browse the heap state at the time your Java program exits with:

java -Xrunhprof:format=b,file=snapshot1.hprof Classname
jhat snapshot1.hprof

Then start up your browser and go to http://localhost:7000, or replace localhost with your machine name and browse from any machine you want.
If you haven't tried this, you should give it a spin.
If you don't want the heap dump at exit, then send your Java process
a ^C (control-C), and HPROF will get you a heap dump file on demand.
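For a quick experiment, any class works as the Classname above. Here's a minimal sketch (SnapshotDemo is a hypothetical name, not anything shipped) that keeps some objects reachable until exit, so the resulting dump has something worth browsing:

```java
// Run with: java -Xrunhprof:format=b,file=snapshot1.hprof SnapshotDemo
// The heap dump written at exit will show the retained list and its strings.
import java.util.ArrayList;
import java.util.List;

public class SnapshotDemo {
    // A static field keeps these objects reachable until the JVM exits.
    static final List<String> retained = new ArrayList<String>();

    public static void main(String[] args) {
        for (int i = 0; i < 10000; i++) {
            retained.add("entry-" + i);
        }
        System.out.println("retained " + retained.size() + " strings");
        // Exiting here is what triggers HPROF to write snapshot1.hprof.
    }
}
```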

Don't want to use the HPROF startup option? You don't have to.
Try using the 'jmap' utility:

jmap -dump:format=b,file=snapshot2.jmap PID_OF_PROCESS
jhat snapshot2.jmap
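To fill in the PID_OF_PROCESS step, the jps utility (also shipped with the JDK) lists running Java process IDs. A sketch of the full recipe, where MyApp is a hypothetical main class name and the jps output is simulated so the PID extraction can be seen on its own:

```shell
# On a live system you would simply run: jps
# Here we simulate its output to show the PID extraction.
JPS_OUTPUT='12345 MyApp
67890 Jps'
PID=$(printf '%s\n' "$JPS_OUTPUT" | grep MyApp | awk '{print $1}')
echo "$PID"

# Then, against the running process:
#   jmap -dump:format=b,file=snapshot2.jmap "$PID"
#   jhat snapshot2.jmap
```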

It doesn't matter how the JVM was started; jmap can get you a heap dump snapshot.
The jmap dump files should contain all the primitive data, but won't
include any stack traces showing where the objects were created.

The HPROF in Build 55 of Mustang should include all the primitive data
and stack traces.

    Prior to Build 55, and in JDK 5.0, the primitive data (characters, integers, etc.) was not included in the HPROF-generated files.
    This was understood in the JDK 5.0 release
    as a limitation we needed to fix.

So in doing this work, we have leveraged the original
HPROF binary heap dump
file format as the basic snapshot of the heap.
Beyond Mustang, we'd like to do a new and more detailed
specification of a binary heap format file, making it
more available, and fixing some of the basic problems
with the format.
This format hasn't changed, and in theory HPROF binary format
files from JDK 1.2 could still be used as input to HAT or the
Mustang jhat, not that I'd recommend it.
The versions of HPROF prior to JDK 5.0 used JVMPI, and the quality
of the file contents can't be guaranteed.
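For flavor, the existing format's fixed header is simple to parse: a NUL-terminated version string, a 4-byte identifier size, and an 8-byte timestamp. A sketch of reading it (not any shipped API; the class and method names here are made up for illustration):

```java
// Reads just the fixed HPROF header: a NUL-terminated version string
// such as "JAVA PROFILE 1.0.2", then a 4-byte object-identifier size,
// then an 8-byte millisecond timestamp.
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class HprofHeaderSketch {

    public static String readHeader(DataInputStream in) throws IOException {
        StringBuilder version = new StringBuilder();
        int b;
        while ((b = in.read()) > 0) {   // stop at the NUL terminator
            version.append((char) b);
        }
        int idSize = in.readInt();      // size of object IDs, typically 4 or 8
        long millis = in.readLong();    // when the dump was taken
        return version + ", idSize=" + idSize + ", millis=" + millis;
    }

    // A hand-built header for demonstration; a real file would
    // come from HPROF or jmap.
    public static byte[] sampleHeader() throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        out.writeBytes("JAVA PROFILE 1.0.2");
        out.writeByte(0);
        out.writeInt(4);
        out.writeLong(0L);
        return bytes.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        DataInputStream in = new DataInputStream(
                new ByteArrayInputStream(sampleHeader()));
        System.out.println(readHeader(in));
    }
}
```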

An improved format, with a set of APIs and tools around
it, could serve as a basis for IDEs or third-party tools to
provide help in tracking down heap-related problems.
We'd like to hear from anyone who might like to participate
in defining this new file format or APIs, and will probably start a
small java.net project as a way to get more involvement from the
Java community. Stay tuned.

One last plug, sorry. :^)

The JDK 5.0 Troubleshooting Guide
also contains lots of information on troubleshooting problems
in JDK 5.0 (it applies to JDK 6.0 too).
If you have never looked at this document, it's worth a scan.
It might answer some of those age-old questions you've
had when your Java process core dumped or crashed, which hopefully
doesn't happen very often anymore.

Alan has recently requested input on the new JDK 6.0 Troubleshooting Guide.
We really want to hear from users as to what is important in this area, so send us your ideas!
Tell us what we can improve or document to make your Java development
job easier.
