How to Analyze a .hprof File

How do I analyze a .hprof file?

If you want a fairly advanced tool to do some serious poking around, look at the Memory Analyzer project at Eclipse, contributed to them by SAP.

Some of what you can do is mind-blowingly good for finding memory leaks and the like, including running a limited form of SQL (OQL) against the objects in the dump, e.g.

SELECT toString(firstName) FROM com.yourcompany.somepackage.User

Totally brilliant.

How to analyze heap data from a .hprof file and use it to reduce memory leaks?

There are many ways to find the root cause of a memory leak:

  • Use a profiler such as JProfiler.
  • Use Eclipse Memory Analyzer, also known as MAT, which can analyze your heap dump and propose potential causes of your memory leak via its Leak Suspects report.
  • Use Java Flight Recorder.
  • Use JVisualVM.
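Whichever tool you pick, the raw input is a heap dump. A minimal sketch of capturing one, assuming a HotSpot JDK with the standard jps/jmap/jcmd tools on the PATH (the PID placeholder and file paths here are hypothetical):

# Find the PID of the target JVM
jps -l

# Option 1: dump only live objects from the running JVM (forces a full GC first)
jmap -dump:live,format=b,file=/tmp/app-heap.hprof <pid>

# Option 2: the jcmd equivalent on newer JDKs
jcmd <pid> GC.heap_dump /tmp/app-heap.hprof

The resulting .hprof file is what you then load into MAT, JVisualVM, or whichever tool you prefer.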

How to analyze the heap dump file generated by JMeter

Basically, a heap dump is a snapshot of the memory of a Java™ process.

Reference: https://www.ibm.com/support/knowledgecenter/en/SS3KLZ/com.ibm.java.diagnostics.memory.analyzer.doc/heapdump.html

Reference: https://dzone.com/articles/java-heap-dump-analyzer-1

You can use Eclipse Memory Analyzer (MAT) for heap analysis.

Reference: https://www.eclipse.org/mat/
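For the JMeter case specifically, it often helps to have the JVM write the dump for you at the moment of failure. A hedged sketch, assuming a HotSpot JVM and JMeter's standard startup script, which picks up extra flags from the JVM_ARGS environment variable (the test plan name and dump path are made up):

# Write a .hprof file automatically when JMeter hits an OutOfMemoryError
JVM_ARGS="-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/jmeter-dumps" \
  ./jmeter -n -t test_plan.jmx

The dump that lands in /tmp/jmeter-dumps can then be loaded into MAT as above.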

Tool or tricks to analyze offline Java heap dumps (.hprof)

Eclipse Memory Analyzer does everything you need.

Tool for analyzing large Java heap dumps

Normally, what I use is ParseHeapDump.sh, included within Eclipse Memory Analyzer, and I run it on one of our more beefed-up servers (download and copy over the Linux .zip distro, unzip it there). The shell script needs fewer resources than parsing the heap from the GUI, and you can run it on your beefy server with more resources (you can allocate more by adding something like -vmargs -Xmx40g -XX:-UseGCOverheadLimit to the end of the last line of the script).
For instance, the last line of that file might look like this after modification:

./MemoryAnalyzer -consolelog -application org.eclipse.mat.api.parse "$@" -vmargs -Xmx40g -XX:-UseGCOverheadLimit

Run it like ./path/to/ParseHeapDump.sh ../today_heap_dump/jvm.hprof

After that succeeds, it creates a number of "index" files next to the .hprof file.

After creating the indices, I generate reports from them, scp those reports to my local machine, and try to see if I can find the culprit from those alone (just the reports, not the indices).

Example report:

./ParseHeapDump.sh ../today_heap_dump/jvm.hprof org.eclipse.mat.api:suspects

Other report options:

org.eclipse.mat.api:overview and org.eclipse.mat.api:top_components
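As far as I know, ParseHeapDump.sh accepts more than one report ID per invocation, so (assuming that holds for your MAT version) you can parse once and produce all three reports in the same pass:

./ParseHeapDump.sh ../today_heap_dump/jvm.hprof org.eclipse.mat.api:suspects org.eclipse.mat.api:overview org.eclipse.mat.api:top_components

Each report is written as a zip of HTML pages next to the .hprof file (the suspects report, for example, ends up in something like jvm_Leak_Suspects.zip), which makes it easy to scp around.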

If those reports are not enough and I need to do some more digging (e.g. via OQL), I scp the indices as well as the .hprof file to my local machine, and then open the heap dump (with the indices in the same directory as the heap dump) with my Eclipse MAT GUI. From there, it does not need too much memory to run.

EDIT:
I'd just like to add two notes:

  • As far as I know, only the generation of the indices is the memory-intensive part of Eclipse MAT. Once you have the indices, most of your processing in Eclipse MAT does not need that much memory.
  • Doing this with a shell script means I can run it on a headless server (and I normally do, because the headless servers are usually the most powerful ones). And if you have a server that can generate a heap dump of that size, chances are you have another server that can process that much of a heap dump as well.

Analyse an HPROF memory dump file from the command line programmatically

ParseHeapDump.sh does what you're looking for. As for the follow-up question, I'm not sure what format the index files are stored in.
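To tie this together, here is a rough sketch of a fully headless pipeline built on ParseHeapDump.sh. The host name and paths are hypothetical, and the exact name of the report zip is an assumption based on how MAT usually names its output:

#!/bin/sh
# Hypothetical headless pipeline: parse a dump, build the leak-suspects report,
# and copy the report back to a workstation for reading.
DUMP=../today_heap_dump/jvm.hprof

# Builds the index files and writes the report zip next to the dump
./ParseHeapDump.sh "$DUMP" org.eclipse.mat.api:suspects

# Assumed output name; check the dump's directory for the actual zip
scp ../today_heap_dump/jvm_Leak_Suspects.zip me@my-workstation:reports/

The index files stay on the server; you only need to copy them down if you want to do deeper digging (e.g. OQL) in the MAT GUI later.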


