# Enable PrintMetaspaceStatistics in Java 8 for Memory Analysis
Your Java application crashes in production with:
```
java.lang.OutOfMemoryError: Metaspace
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at org.apache.catalina.loader.WebappClassLoaderBase.findClassInternal(WebappClassLoaderBase.java:2368)
```

Or you notice the JVM process memory growing over time even though heap usage looks normal. The issue is in Metaspace, the memory area that stores class metadata in Java 8+.
## Real Scenario: Tomcat Application Memory Leak
A financial services company deployed a Tomcat application that processes loan applications. Every few days, the application crashed with OutOfMemoryError: Metaspace. The heap looked fine (only 60% used), but the JVM process was consuming 4GB+ of memory.
Initial diagnosis showed:
```
$ jstat -gcutil 12345 1000 5
  S0     S1     E      O      M     CCS    YGC     YGCT    FGC    FGCT     GCT
  0.00  45.23  62.18  58.34  98.12  95.67    156    2.345    42   12.456   14.801
  0.00  45.23  64.52  58.34  98.12  95.67    156    2.345    42   12.456   14.801
  0.00  45.23  67.89  58.34  98.12  95.67    156    2.345    42   12.456   14.801
```

The M column (Metaspace utilization) is at 98.12%, dangerously close to the limit.
The root cause: A third-party library was dynamically generating classes for each loan application type, and these classes were never unloaded.
## What is Metaspace?
In Java 8, Metaspace replaced the Permanent Generation (PermGen). It stores:
- Class metadata (class names, methods, fields)
- Runtime constant pool
- Method bytecode
- JIT compiled code (in Code Cache)
Unlike PermGen, Metaspace is not part of the Java heap. It's allocated from native memory (off-heap), which means:
- It can grow beyond the heap size
- It's not limited by `-Xmx`
- You need `-XX:MaxMetaspaceSize` to cap it
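You can see this split at runtime through the standard `java.lang.management` API: on HotSpot, Metaspace (and Compressed Class Space) appear as non-heap memory pools, separate from the `-Xmx`-limited heap. A minimal sketch:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryType;

public class MetaspacePools {
    public static void main(String[] args) {
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            // Metaspace and Compressed Class Space are NON_HEAP pools,
            // confirming they live outside the Java heap
            if (pool.getType() == MemoryType.NON_HEAP) {
                System.out.printf("%s: used=%dK committed=%dK%n",
                        pool.getName(),
                        pool.getUsage().getUsed() / 1024,
                        pool.getUsage().getCommitted() / 1024);
            }
        }
    }
}
```

On a HotSpot JVM this prints entries such as `Metaspace`, `Compressed Class Space`, and `Code Cache`; pool names can differ on other JVMs.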
## Enabling PrintMetaspaceStatistics
Add the JVM flag to your startup arguments:
```
java -XX:+PrintMetaspaceStatistics \
     -XX:+PrintGCDetails \
     -XX:+PrintGCDateStamps \
     -Xloggc:/var/log/app/gc.log \
     -jar application.jar
```

For Tomcat, add to `setenv.sh`:

```bash
#!/bin/bash
export JAVA_OPTS="$JAVA_OPTS -XX:+PrintMetaspaceStatistics"
export JAVA_OPTS="$JAVA_OPTS -XX:+PrintGCDetails"
export JAVA_OPTS="$JAVA_OPTS -XX:+PrintGCDateStamps"
export JAVA_OPTS="$JAVA_OPTS -Xloggc:/var/log/tomcat/gc.log"
```

## Understanding the Output
When enabled, the JVM prints metaspace statistics after each garbage collection:
```
2026-04-23T10:30:15.123+0000: [GC (Allocation Failure) [PSYoungGen: 655360K->87328K(764672K)] 655360K->87416K(2513920K), 0.0356234 secs] [Times: user=0.11 sys=0.00, real=0.04 secs]
 Metaspace       used 28432K, capacity 28928K, committed 29184K, reserved 1075200K
  class space    used 3124K, capacity 3200K, committed 3328K, reserved 1048576K
```

### Key Metrics Explained
| Metric | Meaning | Why it matters |
|---|---|---|
| used | Bytes currently storing class metadata | This grows as classes are loaded |
| capacity | Bytes available without growing | When used = capacity, Metaspace expands |
| committed | Bytes actually allocated from OS | This is real memory consumption |
| reserved | Bytes reserved in virtual address space | Not actual memory, just address space |
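If you want to track these numbers over time, the per-GC log line is easy to parse. A minimal sketch (the sample line and field layout are taken from the output above; the regex is an assumption about that exact format):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MetaspaceLogLine {
    // Matches: "Metaspace  used 28432K, capacity 28928K, committed 29184K, reserved 1075200K"
    private static final Pattern LINE = Pattern.compile(
            "Metaspace\\s+used (\\d+)K, capacity (\\d+)K, committed (\\d+)K, reserved (\\d+)K");

    public static void main(String[] args) {
        String sample =
            "Metaspace       used 28432K, capacity 28928K, committed 29184K, reserved 1075200K";
        Matcher m = LINE.matcher(sample);
        if (m.find()) {
            long used = Long.parseLong(m.group(1));
            long committed = Long.parseLong(m.group(3));
            // committed is the real memory cost; used/committed shows how full it is
            System.out.printf("used=%dK committed=%dK (%.1f%% of committed)%n",
                    used, committed, 100.0 * used / committed);
        }
    }
}
```

Feeding each Metaspace line from `gc.log` through this pattern gives a growth curve you can plot or alert on.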
### Class Space vs Non-Class Space
Java 8+ splits metaspace into two areas:
- Class space: Stores Klass structures (internal JVM representation of classes)
- Non-class space: Stores method metadata, constant pools, annotations
If you see class space growing rapidly, you're loading too many classes. If non-class space grows, you have large methods or many annotations.
## Diagnosing Metaspace Issues
### Step 1: Check Current Metaspace Usage
```bash
# Using jstat (quick check)
jstat -gcutil <pid> 1000 10

# Using jcmd (detailed info)
jcmd <pid> GC.heap_info
```
Output from jcmd:
```
Heap:
 PSYoungGen      total 764672K, used 423456K [0x000000076ab00000, 0x00000007a0000000, 0x00000007a0000000)
  eden space 655360K, 64% used [0x000000076ab00000,0x000000078e3b45e8,0x0000000792b00000)
  from space 109312K, 0% used [0x0000000792b00000,0x0000000792b00000,0x0000000799600000)
  to   space 109312K, 0% used [0x0000000799600000,0x0000000799600000,0x00000007a0000000)
 ParOldGen       total 1749504K, used 58234K [0x00000006c0000000, 0x000000072ab00000, 0x000000076ab00000)
  object space 1749504K, 3% used [0x00000006c0000000,0x00000006c38d9878,0x000000072ab00000)
 Metaspace       used 28432K, committed 29184K, reserved 1075200K
  class space    used 3124K, committed 3328K, reserved 1048576K
```

### Step 2: Find What's Loading Classes
```bash
# Enable class-loading tracing on a running JVM
# (TraceClassLoading is a manageable flag, so it can be toggled at runtime)
jcmd <pid> VM.set_flag TraceClassLoading true

# Or start with verbose class loading
java -verbose:class -jar application.jar
```
Look for repeated loading of similar classes:
```
[Loaded com.example.GeneratedClass12345 from __JVM_DefineClass__]
[Loaded com.example.GeneratedClass12346 from __JVM_DefineClass__]
[Loaded com.example.GeneratedClass12347 from __JVM_DefineClass__]
...
```

### Step 3: Count Loaded Classes

```bash
jcmd <pid> GC.class_histogram | head -30
```

Output:
```
 num     #instances         #bytes  class name
----------------------------------------------
   1:         45678       12345678  [C
   2:         12345        9876543  [B
   3:         34567        8765432  java.lang.String
   4:         23456        7654321  java.util.HashMap$Node
   5:         12345        6543210  com.example.GeneratedClass$$Lambda$123
...
```

If you see many `$$Lambda$` or generated classes, that's your leak source.
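The same class counts are available in-process via the standard `ClassLoadingMXBean`, which is handy for exporting to a metrics system: a steadily rising loaded count with a near-zero unloaded count is the leak signature. A minimal sketch:

```java
import java.lang.management.ClassLoadingMXBean;
import java.lang.management.ManagementFactory;

public class ClassCountProbe {
    public static void main(String[] args) {
        ClassLoadingMXBean bean = ManagementFactory.getClassLoadingMXBean();
        // In a healthy app, loaded minus unloaded plateaus over time;
        // monotonic growth points at a class or classloader leak
        System.out.println("currently loaded: " + bean.getLoadedClassCount());
        System.out.println("total loaded:     " + bean.getTotalLoadedClassCount());
        System.out.println("total unloaded:   " + bean.getUnloadedClassCount());
    }
}
```

Sampling these values periodically (for example, once a minute) turns the jcmd snapshot above into a trend you can alert on.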
### Step 4: Identify Class Loader Leaks
```bash
# List all class loaders
jcmd <pid> VM.classloaders
```

Output:

```
1: org.apache.catalina.loader.ParallelWebappClassLoader @ 0x7a8b9c0d
   parent: java.net.URLClassLoader @ 0x6b7a8c9d
   classes: 12345
   ...
2: org.apache.catalina.loader.ParallelWebappClassLoader @ 0x8b9c0d1e
   parent: java.net.URLClassLoader @ 0x6b7a8c9d
   classes: 8765
   ...
```
Multiple WebappClassLoader instances with similar class counts indicate memory leaks from web application redeployments.
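You can also verify whether a suspect class loader is actually collectable with a `WeakReference`: if the reference stays non-null after GC, something (a running thread, a static field, a ThreadLocal) still holds the loader and its classes alive. A minimal sketch using a throwaway `URLClassLoader` (in a real leak hunt you would take the reference to the webapp's loader instead):

```java
import java.lang.ref.WeakReference;
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderLeakCheck {
    public static void main(String[] args) throws Exception {
        URLClassLoader loader = new URLClassLoader(new URL[0]);
        WeakReference<ClassLoader> ref = new WeakReference<>(loader);
        loader = null; // drop the strong reference
        for (int i = 0; i < 10 && ref.get() != null; i++) {
            System.gc();      // request collection (advisory, hence the retry loop)
            Thread.sleep(50);
        }
        System.out.println(ref.get() == null
                ? "loader collected: no leak"
                : "loader still reachable: possible leak");
    }
}
```

Note that `System.gc()` is only a hint, so a single failed attempt proves nothing; a loader that survives many cycles under memory pressure is the real red flag.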
## Solutions
### Solution 1: Increase Metaspace Size
Quick fix for immediate relief:
```
java -XX:MetaspaceSize=256m \
     -XX:MaxMetaspaceSize=512m \
     -jar application.jar
```

- `MetaspaceSize`: initial size (triggers a GC when exceeded)
- `MaxMetaspaceSize`: maximum size (prevents unbounded growth)
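To confirm what a running JVM actually ended up with, you can read these flags in-process through the HotSpot-specific `com.sun.management.HotSpotDiagnosticMXBean` (not part of the Java SE spec, so this sketch assumes a HotSpot JVM):

```java
import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class MetaspaceFlags {
    public static void main(String[] args) throws Exception {
        HotSpotDiagnosticMXBean bean =
                ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // When MaxMetaspaceSize was never set, the default is a huge number,
        // i.e. effectively unlimited -- the unbounded-growth case above
        System.out.println("MaxMetaspaceSize = "
                + bean.getVMOption("MaxMetaspaceSize").getValue());
        System.out.println("MetaspaceSize    = "
                + bean.getVMOption("MetaspaceSize").getValue());
    }
}
```

This is the programmatic equivalent of `jcmd <pid> VM.flags | grep Metaspace`.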
### Solution 2: Enable Class Unloading
In Java 8, class unloading is enabled by default, including for CMS (`-XX:+CMSClassUnloadingEnabled` defaults to on), but it is worth setting explicitly so a stray override can't disable it:
```
java -XX:+UseConcMarkSweepGC \
     -XX:+CMSClassUnloadingEnabled \
     -XX:MaxMetaspaceSize=512m \
     -jar application.jar
```

### Solution 3: Fix the Application Code
Common cause: ThreadLocal with class references
```java
// Problematic code
public class RequestContext {
    private static final ThreadLocal<Map<String, Object>> context =
            ThreadLocal.withInitial(HashMap::new);

    // If threads are pooled and never cleared, this leaks
}
```
Fix:
```java
public class RequestContext {
    private static final ThreadLocal<Map<String, Object>> context =
            ThreadLocal.withInitial(HashMap::new);

    public static void clear() {
        context.remove(); // Must be called when the request ends
    }
}

// In a servlet filter
public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
        throws IOException, ServletException {
    try {
        chain.doFilter(req, res);
    } finally {
        RequestContext.clear(); // Always clean up
    }
}
```
### Solution 4: Fix Dynamic Class Generation
If using a library that generates classes (CGLIB, Javassist, ByteBuddy):
```java
// Problematic: creates a new class for each call
Enhancer enhancer = new Enhancer();
enhancer.setSuperclass(MyClass.class);
enhancer.setCallback(new MethodInterceptor() { ... });
MyClass proxy = (MyClass) enhancer.create();

// Better: cache the proxy
private static final MyClass cachedProxy = createProxy();

private static MyClass createProxy() {
    Enhancer enhancer = new Enhancer();
    enhancer.setSuperclass(MyClass.class);
    enhancer.setCallback(new MethodInterceptor() { ... });
    return (MyClass) enhancer.create();
}
```
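The same caching idea works with the JDK's built-in `java.lang.reflect.Proxy`, which needs no third-party library. A self-contained sketch (the `Greeter` interface is made up for illustration): the proxy instance is created once and reused, so repeated lookups add no new classes to metaspace.

```java
import java.lang.reflect.Proxy;

public class CachedProxyDemo {
    interface Greeter {
        String greet(String name);
    }

    // One proxy instance for the whole application: no per-call class generation
    private static final Greeter CACHED = (Greeter) Proxy.newProxyInstance(
            Greeter.class.getClassLoader(),
            new Class<?>[] { Greeter.class },
            (proxy, method, callArgs) -> "hello, " + callArgs[0]);

    public static Greeter greeter() {
        return CACHED;
    }

    public static void main(String[] args) {
        // Repeated lookups return the same instance -- and the same generated class
        System.out.println(greeter() == greeter());
        System.out.println(greeter().greet("metaspace"));
    }
}
```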
### Solution 5: Configure for Container Environments
For Docker/Kubernetes with memory limits (note that `-XX:+UseContainerSupport` requires JDK 8u191 or later):
```bash
# Container has a 4GB memory limit
# Heap: 1.5GB, Metaspace: 512MB; leave the rest for code cache, thread stacks, and the OS
java -Xms1536m \
     -Xmx1536m \
     -XX:MetaspaceSize=128m \
     -XX:MaxMetaspaceSize=512m \
     -XX:CompressedClassSpaceSize=256m \
     -XX:+UseContainerSupport \
     -XX:MaxRAMPercentage=75.0 \
     -jar application.jar
```

## Production Monitoring Configuration
```
java -XX:+PrintMetaspaceStatistics \
     -XX:+PrintGCDetails \
     -XX:+PrintGCDateStamps \
     -XX:+PrintGCTimeStamps \
     -XX:+PrintGCApplicationStoppedTime \
     -Xloggc:/var/log/app/gc.log \
     -XX:+UseGCLogFileRotation \
     -XX:NumberOfGCLogFiles=10 \
     -XX:GCLogFileSize=100M \
     -XX:MetaspaceSize=128m \
     -XX:MaxMetaspaceSize=512m \
     -XX:MinMetaspaceFreeRatio=40 \
     -XX:MaxMetaspaceFreeRatio=70 \
     -jar application.jar
```

### Parameter Explanations
| Parameter | Value | Purpose |
|---|---|---|
| `MinMetaspaceFreeRatio` | 40 | After GC, at least 40% of metaspace should be free |
| `MaxMetaspaceFreeRatio` | 70 | Don't let more than 70% be free (avoid over-commitment) |
| `GCLogFileSize` | 100M | Rotate logs when they reach 100MB |
| `NumberOfGCLogFiles` | 10 | Keep 10 rotated log files |
## Migration from Java 7 PermGen
If migrating from Java 7, replace PermGen flags:
| Java 7 (PermGen) | Java 8+ (Metaspace) |
|---|---|
| `-XX:PermSize=128m` | `-XX:MetaspaceSize=128m` |
| `-XX:MaxPermSize=256m` | `-XX:MaxMetaspaceSize=256m` |
| `-XX:+PrintGCDetails` (includes PermGen) | `-XX:+PrintMetaspaceStatistics` |
Java 7 configuration:
```
java -XX:PermSize=128m -XX:MaxPermSize=256m -jar app.jar
```

Java 8+ equivalent:

```
java -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=256m -jar app.jar
```

## Troubleshooting Checklist
1. Check metaspace usage:
   ```bash
   jstat -gcutil <pid> | awk '{print $5}'  # M column
   ```
2. Count loaded classes:
   ```bash
   # GC.class_stats requires -XX:+UnlockDiagnosticVMOptions at startup
   jcmd <pid> GC.class_stats | grep "Total" | awk '{print $2}'
   ```
3. Find duplicate class loaders:
   ```bash
   jcmd <pid> VM.classloaders | grep -c "WebappClassLoader"
   ```
4. Check for generated classes:
   ```bash
   jcmd <pid> GC.class_histogram | grep -E "(Lambda|Proxy|Generated)"
   ```
5. Monitor the GC log for metaspace growth:
   ```bash
   grep "Metaspace" /var/log/app/gc.log | tail -20
   ```
6. Verify MaxMetaspaceSize is set:
   ```bash
   jcmd <pid> VM.flags | grep MaxMetaspaceSize
   ```