
1. Overview

The Java Virtual Machine (JVM) is an abstract computing machine that enables a computer to run Java programs. The JVM is responsible for executing the instructions contained in the compiled Java code. To do so, it requires a certain amount of memory to store the data and instructions it needs to operate. This memory is divided into various areas.

In this tutorial, we’ll discuss the different types of runtime data areas and their purpose. These areas are defined by the JVM specification, and every JVM implementation must provide them.

2. Shared Data Areas

The JVM has several runtime data areas that are shared amongst all threads running in the JVM. As a consequence, various threads can simultaneously access any of these areas.

2.1. Heap

The Heap is the runtime data area where all Java objects are stored. Thus, whenever we create a new class instance or array, the JVM finds some available memory in the Heap and assigns it to the object. The Heap is created at JVM start-up and destroyed when the JVM exits.

As per the specification, an automatic management tool must handle the storage of the objects: this tool is known as the garbage collector.

The JVM specification places no constraint on the Heap’s size, and memory handling is left to the JVM implementations. Nevertheless, if the garbage collector cannot reclaim enough free space to create a new object, the JVM throws an OutOfMemoryError.
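To see the Heap fill up in practice, here’s a minimal sketch that keeps allocating arrays while holding strong references to them, so the garbage collector can never reclaim the space. Running it with a deliberately small heap (for example, with the standard -Xmx16m option) ends with an OutOfMemoryError:

import java.util.ArrayList;
import java.util.List;

public class HeapDemo {

    public static void main(String[] args) {
        // Each new array is allocated on the Heap
        List<int[]> blocks = new ArrayList<>();

        // Keeping strong references prevents the garbage collector from
        // reclaiming the arrays; eventually the JVM throws OutOfMemoryError
        while (true) {
            blocks.add(new int[1_000_000]);
        }
    }
}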

2.2. Method Area

The Method Area is a shared data area in the JVM that stores class and interface definitions. It is created when the JVM starts, and it is destroyed only when the JVM exits.

Concretely, the class loader loads the bytecode of the class and passes it to the JVM. The JVM then creates an internal representation of the class, which is used to create objects and invoke methods at runtime. This internal representation gathers information about the fields, methods, and constructors of the classes and interfaces.
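As a small illustration, triggering class loading explicitly is enough to make the JVM build this internal representation. The sketch below simply loads a class by name and prints it; in HotSpot, the corresponding class data ends up in Metaspace, its implementation of the Method Area:

public class MethodAreaDemo {

    public static void main(String[] args) throws ClassNotFoundException {
        // Explicitly loading a class: the class loader reads its bytecode
        // and the JVM stores the internal representation (fields, methods,
        // constructors) in the Method Area
        Class<?> listClass = Class.forName("java.util.ArrayList");
        System.out.println("Loaded: " + listClass.getName());
    }
}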

Additionally, let’s point out that the Method Area is a logical concept. As a result, it may be part of the Heap in a concrete JVM implementation.

Once more, the JVM specification doesn’t define the size of the Method Area, nor does it define how the JVM handles its memory blocks.

If the available space in the Method Area is not enough to load a new class, the JVM throws an OutOfMemoryError.

2.3. Run-Time Constant Pool

The Run-Time Constant Pool is a per-class area within the Method Area that contains symbolic references to class and interface names, field names, and method names.

The JVM creates the Run-Time Constant Pool for a class or interface at the same time as it creates that class’s internal representation in the Method Area.

When creating a Run-Time Constant Pool, if the JVM needs more memory than is available in the Method Area, an OutOfMemoryError is thrown.
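One simple way to observe the constant pool is through string literals: identical literals in the source code resolve to the same pooled reference, while a String constructed at runtime is a distinct Heap object until it’s explicitly interned. A minimal sketch:

public class ConstantPoolDemo {

    public static void main(String[] args) {
        // Both literals resolve to the same pooled instance,
        // so the references are identical
        String first = "baeldung";
        String second = "baeldung";
        System.out.println(first == second);          // true

        // A String built at runtime is a separate Heap object
        String third = new String("baeldung");
        System.out.println(first == third);           // false
        System.out.println(first == third.intern());  // true
    }
}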

3. Per-thread Data Areas

In addition to the shared runtime data areas, the JVM also uses per-thread data areas to store data specific to each thread. Indeed, the JVM supports the execution of many threads at the same time.

3.1. PC Register

Each JVM thread has its own PC (program counter) register. At any given time, each thread executes the code of a single method. The behavior of the PC depends on the nature of the method:

  • For a non-native method, the PC register stores the address of the current instruction being executed
  • For a native method, the PC register has an undefined value

Lastly, let’s note that the PC register’s lifecycle is the same as that of its underlying thread.

3.2. JVM Stack

Similarly, each JVM thread has its own private Stack. The JVM Stack is a data structure that stores method invocation information. Each method call triggers the creation of a new frame on the stack to store the method’s local variables and the return address. Those frames may be allocated on the Heap.

Thanks to the JVM Stack, the JVM can keep track of the execution of a program and log the stack trace on demand.
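For instance, we can walk the frames of the current thread’s stack directly from Java; the following sketch prints one line per frame, from the innermost call outwards:

public class StackTraceDemo {

    private static void inner() {
        // Each StackTraceElement corresponds to a frame
        // on the current thread's JVM Stack
        for (StackTraceElement frame : Thread.currentThread().getStackTrace()) {
            System.out.println(frame);
        }
    }

    private static void outer() {
        inner();
    }

    public static void main(String[] args) {
        outer();
    }
}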

Once again, the JVM specification lets the JVM implementations decide how they want to handle the JVM Stack’s size and memory allocation.

A memory allocation error on the JVM Stack entails a StackOverflowError. However, if a JVM implementation allows the dynamic extension of its JVM Stack’s size, and a memory error occurs during the extension, the JVM has to throw an OutOfMemoryError.
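The classic way to hit this limit is unbounded recursion, since every call pushes a new frame onto the current thread’s JVM Stack. A minimal sketch:

public class StackDemo {

    private static int depth = 0;

    // Each call pushes a new frame onto the current thread's JVM Stack;
    // unbounded recursion eventually exhausts it
    private static void recurse() {
        depth++;
        recurse();
    }

    public static void main(String[] args) {
        try {
            recurse();
        } catch (StackOverflowError e) {
            System.out.println("Stack exhausted at depth " + depth);
        }
    }
}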

3.3. Native Method Stack

A native method is a method written in a programming language other than Java. These methods aren’t compiled to bytecode, hence the need for a different memory area.

The Native Method Stack is very similar to the JVM Stack but is only dedicated to native methods.

The purpose of the Native Method Stack is to keep track of the execution of a native method.
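For reference, this is what a native method looks like on the Java side: the body is provided by a native library (the library name demo below is just a placeholder for this sketch), and its execution is tracked on the Native Method Stack rather than the JVM Stack:

public class NativeDemo {

    static {
        // Loads the native library containing the implementation;
        // "demo" is an assumed library name for this sketch
        System.loadLibrary("demo");
    }

    // Declared native: the implementation is written in another language
    // (typically C/C++) and runs on the Native Method Stack
    public native long sumNative(long a, long b);
}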

The JVM implementation can decide on its own the size of the Native Method Stack and how it handles memory blocks.

As with the JVM Stack, a memory allocation error on the Native Method Stack leads to a StackOverflowError. On the other hand, a failed attempt to increase the Native Method Stack’s size leads to an OutOfMemoryError.

To conclude, let’s note that the specification highlights that a JVM implementation could decide not to support native method calls: such an implementation wouldn’t need to implement a Native Method Stack.

4. Conclusion

In this tutorial, we’ve elaborated on the different types of runtime data areas and their purpose. These areas are crucial for the proper functioning of the JVM. Understanding them can help optimize the performance of Java applications.
