How a Newcomer Investigated a Java OutOfMemoryError.
So one fine day the newcomer came to the office and got an interesting problem from his team lead.
One particular API was throwing a Java OutOfMemoryError with only 10 concurrent threads.
So the newcomer started to investigate the problem.
At first he logged in to the server and tried to collect stats about the particular JVM process that was throwing the error. He needed the id of this process, so he used the jps utility (shipped in java/bin), which lists the process ids of all the Java processes running on that system. That gave him the id of the process.
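A minimal sketch of that step (the exact process names and ids will of course differ on every machine):

    # List all running Java processes with their process ids.
    # -l prints the full main class or jar name, which makes the right process easier to spot.
    jps -l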
Next he ran the jstat command (also in java/bin) with the -gc option to get the garbage collector stats of that Java process. But oh snap, there was nothing. That was odd.
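For reference, this is roughly what that invocation looks like; <pid> stands for the process id obtained from jps above:

    # Print garbage collector statistics for the given process,
    # refreshing every 1000 ms until interrupted.
    jstat -gc <pid> 1000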
So the newcomer googled it and found that if a process runs as one user and jstat is run as a different user, it may fail to return any results.
He used the top command to find out which user the process was running as. It was tomcat7. So that was handy.
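Something like the following shows it; the USER column of top tells you who owns the process:

    # Show only the target process; the USER column reveals the owning user (tomcat7 here).
    top -p <pid>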
Now he just needed to log in to the system as that user. He tried sudo su tomcat7.
But nothing happened.
What could it be …
He opened the /etc/passwd file and saw that there was no login shell configured for the tomcat7 user. So he could not log in as tomcat7.
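Checking that is a one-liner; a shell field such as /bin/false or /usr/sbin/nologin means interactive logins are disabled for the account:

    # Inspect the tomcat7 entry in /etc/passwd; the last field is the login shell.
    grep tomcat7 /etc/passwd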
Now he decided to replicate the issue on his local system so he could investigate it there.
He could get the code, but what he still needed were the JVM settings of the process on the server, so he could run it locally with the same settings.
He found the ps utility of Linux and used it as ps -ef | grep tomcat7. The -e flag selects every process and -f prints the full-format listing, which includes the complete command line with all its JVM arguments.
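Sketched out:

    # -e selects every process, -f prints the full-format listing (including the full command line).
    # The JVM options (-Xmx, -Xms, GC flags, ...) show up inside the matched command line.
    ps -ef | grep tomcat7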
He found out that the maximum heap allowed to the process was only 128 MB, along with everything else about that process.
Now he had the JVM settings, the code, and everything needed to test it locally.
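A hypothetical sketch of the local run with the same heap limit (the jar name and start command here are assumptions, not taken from the article):

    # Reproduce locally with the same 128 MB maximum heap as the server process.
    # "my-api.jar" is a placeholder for whatever artifact the project actually builds.
    java -Xmx128m -jar my-api.jar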
He installed VisualVM, a tool that shows all this information about a Java process with nice visualizations.
He executed the code on his local machine and started monitoring it with VisualVM. In the Monitor tab, VisualVM showed him all the threads, the heap, the CPU, and the loaded classes. There was also an option to take a heap dump. He dumped a heap snapshot and started analyzing it with a heap analyzer.
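The same snapshot can also be taken from the command line with jmap (listed in the toolbox below); a sketch:

    # Write a heap dump of live objects to heap.hprof for offline analysis.
    jmap -dump:live,format=b,file=heap.hprof <pid>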
He looked for the dominating objects, the ones taking up the most space on the heap. There were lots of objects of various classes. With the help of a regex he checked whether there were any mentions of his project's classes. His classes were there, but their object counts and retained heap were negligible. The dominating objects were from core Java packages and some APIs.
So there was nothing to fix in the code. He reported that the issue was simply that the process had too little memory, and that to solve the problem it needed to be given more memory.
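As an illustration only, that fix usually boils down to raising -Xmx wherever the process is started; for a Tomcat deployment that is typically JAVA_OPTS or CATALINA_OPTS in setenv.sh (the 512 MB value below is an arbitrary assumption, not from the article):

    # Example: allow the JVM a larger maximum heap than the original 128 MB.
    export JAVA_OPTS="$JAVA_OPTS -Xmx512m"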
So this is how a simple Java OutOfMemoryError was investigated.
Now we have these tools in our toolbox:
1. jps (to get the process id)
2. jstat (statistics about a JVM process)
3. jmap (to take a heap dump)
4. jhat (to visualize the heap dump, see the sketch below)
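A sketch of the jhat step, assuming the heap.hprof file from the jmap example above (note that jhat was removed in JDK 9+, where VisualVM or Eclipse MAT can open the same file):

    # Start a small web server on port 7000 to browse the heap dump in a browser.
    jhat -port 7000 heap.hprof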
Some tools like VisualVM, JConsole, and JVM Monitor provide all of this in a single app. But these are all GUI based, so when you don't have a GUI on a system, the commands mentioned above are the savior.
There is also a profiling option in VisualVM, where you can see the time taken by every method and the invocation count of every method. So the next time something is slow, you can know exactly what is taking the time.