
What is Java Memory Analysis

By: Iryen  |  May 7, 2021

Java memory analysis is an important part of checking the performance of a Java application. It helps Java developers ensure the stability of the application by keeping an eye on memory consumption. There are several factors to look into when doing memory analysis, but to get to the bottom of the process, it is vital to first learn how memory works.

What is Memory?

Inexperienced developers often assume that memory and memory analysis are the same across different programming languages. That is not the case: the concept of memory is roughly the same across languages, but the way each language accesses and manages memory varies considerably.

But, what is the relationship between memory and building an application? 

Random-access memory (RAM) is one of the most valuable resources available when building an application. It works hand-in-hand with the central processing unit (CPU) to process inputs effectively. Everything in computer memory is ultimately represented as binary digits (bits): each memory cell stores either a 1 or a 0. All programs and files consist of bits, and as the number of bits grows, developers constantly face the problem of optimizing memory consumption.

All About Java Memory

Before we delve into how Java memory analysis works, let’s first learn the difference between Stack and Heap memory in Java. 

Java Heap

The Java Virtual Machine (JVM) stores all objects created by the Java application, along with JRE classes, in the heap. The heap uses dynamic allocation, as there is no fixed pattern for allocating and deallocating blocks of memory. Developers can increase or decrease the heap size with the JVM options -Xms (initial heap size) and -Xmx (maximum heap size).

The heap is visible to all threads. The JVM throws a java.lang.OutOfMemoryError when the application runs out of heap memory. Memory allocated on the heap is released when one of the following occurs:

  • The program terminates
  • The memory is freed, i.e. the object is no longer referenced and the garbage collector reclaims it
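To see the heap limit in action, here is a minimal sketch (the class name HeapExhaustionDemo is made up for illustration) that keeps objects reachable so the garbage collector cannot free them; run it with a small heap, for example java -Xms64m -Xmx64m HeapExhaustionDemo, and it ends with java.lang.OutOfMemoryError:

import java.util.ArrayList;
import java.util.List;

public class HeapExhaustionDemo {
    public static void main(String[] args) {
        List<byte[]> blocks = new ArrayList<>();
        while (true) {
            // Each 1 MB block stays referenced by the list, so the GC cannot reclaim it
            // and the heap eventually fills up.
            blocks.add(new byte[1024 * 1024]);
        }
    }
}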

Java Stack

The Java stack stores the order of method execution and the local variables used in the application. It always stores blocks in last-in, first-out (LIFO) order. When a method is called, its stack frame is placed on top of the call stack. The stack frame holds the current state of the method, including which line of code is executing and the values of all local variables. Take note that the method at the top of the stack is always the currently running method for that stack.

Unlike the heap, which is shared, each thread has its own call stack. Developers can increase the stack size with the JVM option -Xss.

Additionally, the JVM throws a java.lang.StackOverflowError when the stack is exhausted, for example by runaway recursion. In contrast to the heap, memory allocated on the stack lives only until the method returns.
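As a minimal sketch of how the stack fills up (the class name StackDepthDemo is made up for illustration), the method below recurses without a base case, so each call adds a stack frame until the JVM throws the error; a smaller -Xss makes it happen sooner:

public class StackDepthDemo {
    private static int depth = 0;

    private static void recurse() {
        depth++;        // each call adds a new stack frame with its own copy of the locals
        recurse();      // no base case, so the stack eventually overflows
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (StackOverflowError e) {
            System.out.println("Stack overflowed at depth " + depth);
        }
    }
}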

Memory Affecting Performance

So, how does memory affect performance? To answer this question, it is important to know how an application uses memory. Applications don't use physical RAM directly; instead, each application works with virtual memory, which on a 32-bit computer is limited to 4 GB of address space.

Virtual memory is divided into 4,096-byte pages. Physical RAM holds a page table that tells the CPU how to map the application's virtual addresses to physical memory. Each page in virtual memory has an address and a flag indicating whether it is valid.

At the start, the operating system gives the application only a few megabytes to work with. When the application needs more, it signals the operating system, which assigns an appropriate number of pages and hands the page addresses back to the application.

That is a rough overview of how memory is handed out, and if it is not managed properly it can lead to memory starvation. Memory starvation is often the cause of instability and unresponsiveness in an application. For instance, starvation happens when a poorly designed multitasking application keeps switching between its first two tasks: the third queued task is never initiated and is starved of processing time.

The programming language's design and its memory-related features are another factor to consider. For example, running hundreds of queries without closing the connections can lead to leaks. Memory leaks happen when the garbage collector is unable to remove objects from working memory because they are still referenced.

Java Application Monitoring

The goal of any Java memory analysis is to examine the application's memory consumption. It includes analyzing how memory capacity affects application response time and CPU usage. For instance, memory shortages and leaks can degrade response time and result in major instability.

Additionally, Java memory analysis includes ways to pinpoint the processes that cause these memory problems, such as examining excessive garbage collection. For instance, if garbage collection is hurting response time, the first step is to optimize its configuration.

Remember that every configuration change should measurably reduce the impact. Sometimes the problem isn't solved even when the configuration is optimized. In that case, consider other angles: for example, look at the allocation patterns and analyze the memory usage itself.

Memory analysis is a broad topic but here are some important areas to consider:

Escape Analysis

Java objects are created and stored in the heap. In Java, developers don't get to decide whether an object should be placed on the stack. In practice, though, stack allocation can be desirable, primarily because allocating memory on the stack is cheaper than allocating it on the heap. Furthermore, deallocation on the stack is free, and the stack is managed efficiently by the runtime.

This is where escape analysis comes in: it checks whether an object is used only within a single thread or method. The JVM performs escape analysis and decides whether the object can effectively be allocated on the stack instead of the heap. Keeping such objects off the heap can improve the performance of the Java application.
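Here is a minimal sketch of the kind of code that benefits (the class names EscapeDemo and Point are made up for illustration): the Point created in distanceFromOrigin() never leaves the method, so the JIT's escape analysis may keep it off the heap. Whether that actually happens is the JVM's decision, not something the code controls.

public class EscapeDemo {
    static final class Point {
        final double x;
        final double y;
        Point(double x, double y) { this.x = x; this.y = y; }
    }

    static double distanceFromOrigin(double x, double y) {
        Point p = new Point(x, y);               // p never escapes this method
        return Math.sqrt(p.x * p.x + p.y * p.y);
    }

    public static void main(String[] args) {
        double total = 0;
        for (int i = 0; i < 10_000_000; i++) {
            total += distanceFromOrigin(i, i);   // a hot loop gives the JIT a reason to optimize
        }
        System.out.println(total);
    }
}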

Monitor Garbage Collection 

Generally, garbage collection is a process that identifies resources that are no longer in use and releases them so that they can be used again by the application.

The JVM garbage collector releases Java objects from memory once they are no longer referenced at all. The JVM automatically reclaims memory that is not used anymore. As mentioned earlier, a working garbage collector automatically releases the memory of objects that are no longer referenced. To see whether the garbage collector is working, add the command-line argument -verbose:gc to your virtual machine.
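As a minimal sketch (the class name GcLogDemo is made up for illustration), the loop below churns through short-lived allocations; started with java -verbose:gc GcLogDemo, the JVM prints a log line for every collection it performs:

public class GcLogDemo {
    public static void main(String[] args) {
        long total = 0;
        for (int i = 0; i < 1_000_000; i++) {
            byte[] garbage = new byte[10 * 1024];  // unreachable again after each iteration
            total += garbage.length;               // use the array so the allocation isn't optimized away
        }
        System.out.println("Allocated about " + (total / (1024 * 1024)) + " MB of short-lived data");
    }
}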

Different languages have different mechanisms. For instance, Python uses reference counting alongside its garbage collection module. Java's garbage collection, on the other hand, is stringent, which is part of what makes Java a memory-safe language.

Now, if that is the case, why do we still need Java memory analysis?

The answer is simple. Some Java applications fit this memory management model and perform well. However, not all Java applications are created equal; complex applications can still perform poorly because of memory constraints.

This happens when an application allocates too many objects too quickly. The high churn rate fills the young generation fast, so the garbage collector (GC) must be triggered more often. Remember, a high churn rate can prevent optimal generation sizing, so developers should fix this problem in their code before attempting to tune garbage collection itself.

The Java GC can usually cope without overflowing the old generation, but it does so at the expense of the application's performance. Also consider that Java memory management doesn't let developers go beyond the allocated memory: an error or exception is thrown when memory is consumed beyond its allocation.

Check the overall memory usage of the app

jvisualvm is one of the memory analysis tools for Java, used to analyze the runtime behavior of a Java application. It attaches to a running Java program and tracks its memory and CPU consumption. It can also create a heap dump so you can analyze the objects in the heap.

Generally, a process within the application is considered expensive if it has a long runtime or high memory consumption.

The total and free memory of a program can be obtained from within the program via java.lang.Runtime.getRuntime().
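Here is a minimal sketch of that call in use (the class name MemoryReport is made up for illustration); it prints the JVM's current heap figures:

public class MemoryReport {
    public static void main(String[] args) {
        Runtime runtime = Runtime.getRuntime();
        long total = runtime.totalMemory();  // heap currently reserved by the JVM
        long free  = runtime.freeMemory();   // unused portion of that reserved heap
        long max   = runtime.maxMemory();    // upper bound, e.g. the value set with -Xmx
        System.out.printf("used=%d MB, free=%d MB, total=%d MB, max=%d MB%n",
                (total - free) / (1024 * 1024),
                free / (1024 * 1024),
                total / (1024 * 1024),
                max / (1024 * 1024));
    }
}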

Monitor the executed actions or methods

Part of your Java memory analysis is to monitor the executed actions or methods in your application. Oftentimes, developers use an event-based measurement method that analyzes individual method executions. 

This is done by logging a timestamp at the beginning and end of each method call. The result is the total number of times each method is called and the exact execution time of every call.
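A minimal sketch of that log-and-timestamp approach (processOrder() and the class name are made up for illustration): record the time before and after the call and log the difference.

public class MethodTimingDemo {
    static void processOrder() {
        // stand-in for real work
        try { Thread.sleep(50); } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
    }

    public static void main(String[] args) {
        long start = System.nanoTime();          // timestamp at the beginning of the call
        processOrder();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;
        System.out.println("processOrder took " + elapsedMillis + " ms");
    }
}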

The JVM Tool Interface (JVM TI) offers callbacks inside the Java runtime that fire at the beginning and end of method executions. However, this approach carries high overhead that may distort the application's runtime behavior. Hence, look for modern performance measurement tools that use bytecode instrumentation instead; it carries less overhead and interferes less with the application.

Check on the memory classes/libraries that are used or loaded by the app

When building a complex Java application, expect that something will eventually fail or that you will encounter an OutOfMemoryError. Memory problems always present new and unexpected challenges. Hence, one of the best practices in memory analysis is to check which classes and libraries your application is using or loading.

For huge applications, doing this manually is impractical. Developers constantly deal with large numbers of loaded classes, several external and internal libraries, and other relevant metrics. Hence, you can seek help from Stackify Prefix. It provides deep-level performance details about your app, performing code tracing that covers external libraries, SOAP/REST API calls, and other details from the most commonly used third-party libraries and frameworks.

Monitor the Java Threads 

Active Java threads are another JVM memory metric to monitor. Before delving into the concepts behind threads, here are the two types of threads in Java to look into (a short sketch follows the list):

  • Daemon threads are service providers for user threads. The JVM creates them, their lifetime depends on user threads, and they run at low priority. They perform garbage collection and other housekeeping tasks.
  • User threads are created by the application or the user. These are high-priority threads, and the JVM waits for them to finish their tasks.
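A minimal sketch of the difference (the class name ThreadTypesDemo is made up for illustration): the daemon thread keeps looping, but the JVM exits as soon as the user thread, main, is done.

public class ThreadTypesDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread housekeeping = new Thread(() -> {
            while (true) {
                System.out.println("daemon: housekeeping tick");
                try { Thread.sleep(200); } catch (InterruptedException e) { return; }
            }
        });
        housekeeping.setDaemon(true);  // must be set before start()
        housekeeping.start();

        Thread.sleep(600);             // main is a user thread; the JVM waits for it, not for the daemon
        System.out.println("user thread done - JVM exits, daemon is discarded");
    }
}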

Threads can make or break your application. Too many threads can slow down response time: the higher the number of threads, the higher the processor utilization, because each thread needs processing power and the CPU also spends time frequently switching between them.

When a high number of concurrent requests is expected, the number of threads in use increases. The caveat is that response times for your application's users may suffer as a result.

You can manage threads accordingly. Threads are beneficial when working with concurrent tasks such as fetching data from or writing data to a database. Developers use threads to improve application performance, especially for I/O-bound work. However, take note that issues become prevalent when many threads are doing concurrent work.

Another important point to consider is thread overhead, which produces a general slowdown in the application. Overhead arises when creating and destroying a thread, and also when saving and restoring its state. It is unavoidable because hardware resources are finite and shared.
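To spot-check thread usage with the JDK itself, here is a minimal sketch (the class name ThreadCountDemo is made up for illustration) that reads the live, daemon, and peak thread counts from the standard ThreadMXBean:

import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadCountDemo {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        System.out.println("live threads:   " + threads.getThreadCount());
        System.out.println("daemon threads: " + threads.getDaemonThreadCount());
        System.out.println("peak threads:   " + threads.getPeakThreadCount());
    }
}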

APMs like Stackify Retrace can track the number of threads in your Java application over time. Retrace provides a historical reference and can be configured to alert you when your application goes beyond its average number of threads, with notifications and guidance to limit the number of threads you are running.

JMX Monitoring Using Retrace

Java, being a robust programming language, provides tools and functions to deal with memory consumption. Still, there is a relatively easy way to identify Java's problematic code: try Stackify's free code profiler, Prefix, to help you write better code. It works with .NET, Java, PHP, Node.js, Ruby, and Python.

Furthermore, there are many ways to perform Java memory analysis. You can opt for memory-centric profiling tools, tools that specialize in Java memory leak analysis, or a generic APM with strong features for monitoring your application.

Java Management Extensions (JMX) is a Java technology for monitoring and managing Java applications. It is widely accepted among developers because it enables a generic management system, provides notifications when the application needs attention, and lets you change the state of your application to address problems.
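For an in-process taste of the kind of data JMX exposes, here is a minimal sketch (the class name JmxMemoryDemo is made up for illustration) that reads heap usage from the platform MemoryMXBean; the same MBeans can be read remotely by JMX-aware tools:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class JmxMemoryDemo {
    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        System.out.printf("heap used=%d MB, committed=%d MB, max=%d MB%n",
                heap.getUsed() / (1024 * 1024),
                heap.getCommitted() / (1024 * 1024),
                heap.getMax() / (1024 * 1024));   // getMax() can be -1 if no limit is set
    }
}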

JMX is a powerful tool. When JMX is paired with Retrace, it shows the overall performance metrics of your application. Start your 14-day FREE trial and use Retrace with JMX data today! 
