What's Actually Happening
Python memory profiling tools fail to run, produce no output, or report incorrect results: profiling crashes immediately, shows empty results, or the profiler import itself fails.
The Errors You'll See
Import error:
```python
import memory_profiler
# ModuleNotFoundError: No module named 'memory_profiler'
```
Profiler shows no output:
```python
@profile
def my_function():
    data = [i for i in range(100000)]
    return data

my_function()  # No output, no profiling data
```
Permission denied:
```bash
$ python -m memory_profiler script.py
PermissionError: [Errno 13] Permission denied: '/proc/12345/maps'
```
Empty or zero values:
```
Line #    Mem usage    Increment  Occurrences   Line Contents
=============================================================
     1      0.0 MiB      0.0 MiB           1   @profile
     2                                         def my_function():
     3      0.0 MiB      0.0 MiB           1       pass
```
All zeros, no actual profiling.
Why This Happens
1. Package not installed - memory_profiler is missing from the active environment
2. Wrong Python version - the interpreter in use is incompatible with the installed profiler
3. Missing psutil - the recommended psutil dependency is not installed
4. Permission issues - the profiler cannot read /proc on Linux
5. Decorator not applied - @profile is not recognized
6. Run as a plain script - the profiler requires the -m flag (python -m memory_profiler)
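The causes above can be checked quickly from a REPL. Below is a small diagnostic sketch using only the standard library; the `diagnose` helper name is ours, not part of memory_profiler:

```python
import importlib.util
import os
import sys

def diagnose():
    """Check the common causes of a silent or broken memory profiler."""
    checks = {
        'python_version': sys.version_info >= (3, 5),
        'memory_profiler_installed': importlib.util.find_spec('memory_profiler') is not None,
        'psutil_installed': importlib.util.find_spec('psutil') is not None,
        # /proc only exists on Linux; on other platforms memory_profiler relies on psutil
        'proc_readable': os.access('/proc/self/maps', os.R_OK) if os.path.exists('/proc') else True,
    }
    return checks

for name, ok in diagnose().items():
    print(f"{name}: {'OK' if ok else 'FAIL'}")
```

Any `FAIL` row points at one of the numbered causes above.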
Step 1: Install Memory Profiler
```bash
# Install memory_profiler with its recommended psutil dependency
pip install memory_profiler psutil

# Check installation
python -c "import memory_profiler; print(memory_profiler.__version__)"

# Verify psutil works
python -c "import psutil; print(psutil.Process().memory_info())"

# For conda users
conda install -c conda-forge memory_profiler
```
Step 2: Basic Usage
```python
# profile_test.py
from memory_profiler import profile

@profile
def my_function():
    a = [1] * (10 ** 6)
    b = [2] * (2 * 10 ** 7)
    del b
    return a

if __name__ == '__main__':
    my_function()
```

Run with the profiler:

```
$ python -m memory_profiler profile_test.py
Line #    Mem usage    Increment  Occurrences   Line Contents
=============================================================
     3     38.8 MiB     38.8 MiB           1   @profile
     4                                         def my_function():
     5     46.5 MiB      7.7 MiB           1       a = [1] * (10 ** 6)
     6    199.6 MiB    153.1 MiB           1       b = [2] * (2 * 10 ** 7)
     7     46.5 MiB   -153.1 MiB           1       del b
     8     46.5 MiB      0.0 MiB           1       return a
```
Step 3: Fix Import Errors
```python
# If @profile is not recognized, import it explicitly:

# Option 1: import from the module
from memory_profiler import profile

@profile
def my_function():
    pass

# Option 2: use the decorator via the module
import memory_profiler

@memory_profiler.profile
def my_function():
    pass

# Option 3: profile a specific function programmatically
from memory_profiler import memory_usage

def my_function():
    data = [i for i in range(1000000)]
    return sum(data)

mem_usage = memory_usage((my_function, (), {}), interval=0.1)
print(f'Max memory usage: {max(mem_usage)} MiB')
```
Step 4: Fix Permission Issues
```bash
# On Linux, the profiler reads memory data from /proc
# Check that the maps file is readable
ls -la /proc/self/maps

# Profiling a process you don't own requires elevated privileges
sudo python -m memory_profiler script.py

# In locked-down Docker containers, /proc access may be restricted:
docker run --privileged ...
```

Or sidestep /proc entirely with the built-in tracemalloc:

```python
import tracemalloc

tracemalloc.start()

# Your code
data = [i for i in range(1000000)]

snapshot = tracemalloc.take_snapshot()
top_stats = snapshot.statistics('lineno')

for stat in top_stats[:10]:
    print(stat)
```
Step 5: Use Tracemalloc Alternative
```python
# Built-in memory tracer (Python 3.4+)
import tracemalloc

# Start tracing
tracemalloc.start()

# Your code
def create_large_list():
    return [i for i in range(1000000)]

data = create_large_list()

# Take a snapshot and list the top allocations
snapshot = tracemalloc.take_snapshot()
top_stats = snapshot.statistics('lineno')
print("[ Top 10 ]")
for stat in top_stats[:10]:
    print(stat)

# Compare two snapshots to find growth (tracing is already started)
snapshot1 = tracemalloc.take_snapshot()

# Your code
data2 = [i for i in range(500000)]

snapshot2 = tracemalloc.take_snapshot()
top_stats = snapshot2.compare_to(snapshot1, 'lineno')

print("[ Memory growth ]")
for stat in top_stats[:10]:
    print(stat)
```
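When you only need totals rather than per-line statistics, tracemalloc can also report the current and peak traced sizes directly, without taking snapshots:

```python
import tracemalloc

tracemalloc.start()

data = [i for i in range(1_000_000)]

# Bytes allocated (and the high-water mark) since start()
current, peak = tracemalloc.get_traced_memory()
print(f"current: {current / 1024 / 1024:.1f} MiB, peak: {peak / 1024 / 1024:.1f} MiB")

tracemalloc.stop()
```

This is the cheapest way to confirm the tracer is actually seeing your allocations.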
Step 6: Use cProfile Alternative
```python
# cProfile measures CPU time, not memory, but helps locate hot spots
import cProfile
import pstats

def my_function():
    data = [i for i in range(1000000)]
    return sum(data)

# Profile
profiler = cProfile.Profile()
profiler.enable()

my_function()

profiler.disable()
stats = pstats.Stats(profiler)
stats.sort_stats('cumulative')
stats.print_stats(10)

# Or from the command line:
# python -m cProfile -s cumulative script.py
```
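If you need the report as a string (for logging or test assertions) rather than printed to stdout, `pstats.Stats` accepts a `stream` argument; a minimal sketch:

```python
import cProfile
import io
import pstats

def my_function():
    return sum(i for i in range(100000))

profiler = cProfile.Profile()
profiler.enable()
my_function()
profiler.disable()

# Direct the report into a StringIO buffer instead of stdout
buffer = io.StringIO()
stats = pstats.Stats(profiler, stream=buffer)
stats.sort_stats('cumulative')
stats.print_stats(5)

report = buffer.getvalue()
print(report)
```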
Step 7: Profile with Line Profiler
```bash
# Install line_profiler
pip install line_profiler
```

```python
# script.py -- line_profiler times each line (CPU, not memory)
from line_profiler import LineProfiler

def slow_function():
    total = 0
    for i in range(10000):
        total += i
    return total

lp = LineProfiler()
lp_wrapper = lp(slow_function)
lp_wrapper()
lp.print_stats()
```

```python
# Or use kernprof from the command line: add the @profile
# decorator (kernprof injects it at runtime, no import needed)
@profile
def slow_function():
    total = 0
    for i in range(10000):
        total += i
    return total

if __name__ == '__main__':
    slow_function()

# Run:
# kernprof -l -v script.py
```
Step 8: Profile Memory with objgraph
```bash
# Install objgraph (graph output also requires graphviz)
pip install objgraph
```

```python
import objgraph

# Create objects
data = [i for i in range(1000)]

# Count the most common object types
objgraph.show_most_common_types()

# Show a reference graph (written as an image; needs graphviz)
objgraph.show_refs([data], filename='refs.png')

# Show back-references
objgraph.show_backrefs([data], filename='backrefs.png')

# Show which types have grown since the last call
objgraph.show_growth()
```
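If objgraph cannot be installed, a rough stand-in for `show_most_common_types()` can be built from the standard library's `gc` and `collections` modules; the `most_common_types` helper below is our own sketch, not an objgraph API:

```python
import collections
import gc

def most_common_types(limit=10):
    """Count live objects by type name, like objgraph.show_most_common_types()."""
    counts = collections.Counter(type(o).__name__ for o in gc.get_objects())
    return counts.most_common(limit)

for name, count in most_common_types():
    print(f"{name:20} {count}")
```

It lacks objgraph's reference graphs, but it is enough to spot a type whose count keeps climbing between calls.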
Step 9: Profile with Fil
```bash
# Install the Fil profiler
pip install filprofiler

# Profile a script
fil-profile run script.py

# The report is written to a fil-result/ directory;
# open the peak-memory SVG for a flame-graph view
```

```python
# Programmatic use: Fil's API takes a callable and an output path.
# Note: it only collects data when the script is launched via
#   fil-profile python script.py
from filprofiler.api import profile

def my_function():
    data = [i for i in range(1000000)]
    return data

result = profile(lambda: my_function(), "fil-result")
```
Step 10: Debug Profiler Issues
```python
# Debug why the profiler is not working

# 1. Check that the profiler imports
from memory_profiler import profile
import sys

print(f"Python: {sys.version}")
print(f"Profiler: {profile}")

# 2. Check that the function is wrapped
@profile
def test():
    return 1

print(f"Function wrapped: {hasattr(test, '__wrapped__')}")

# 3. Run a minimal profile (retval=True returns a (memory, result) pair)
from memory_profiler import memory_usage

def simple():
    x = [0] * 10000
    return x

max_mem, value = memory_usage((simple, (), {}), retval=True, max_usage=True)
print(f"Memory used: {max_mem} MiB")

# 4. Check the environment
import os
print(f"PYTHONPATH: {os.environ.get('PYTHONPATH')}")
print(f"sys.path: {sys.path}")

# 5. Capture profiler output that would otherwise be swallowed
import io
from contextlib import redirect_stdout

f = io.StringIO()
with redirect_stdout(f):
    pass  # profile code here
output = f.getvalue()
print(output)
```
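If memory_profiler still refuses to run at all, the standard library's `resource` module (Unix only) can at least report the process's peak resident set size as a crude fallback; a minimal sketch, noting that `ru_maxrss` is in kilobytes on Linux but bytes on macOS:

```python
import resource
import sys

def peak_rss_mib():
    """Return this process's peak resident set size in MiB (Unix only)."""
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    # ru_maxrss is kilobytes on Linux, bytes on macOS
    divisor = 1024 * 1024 if sys.platform == 'darwin' else 1024
    return peak / divisor

data = [i for i in range(1_000_000)]
print(f"Peak RSS: {peak_rss_mib():.1f} MiB")
```

It gives one number rather than a line-by-line breakdown, but it needs no third-party packages and no /proc permissions beyond your own process.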
Python Memory Profiler Checklist
| Check | Command | Expected |
|---|---|---|
| Package installed | pip list | memory_profiler |
| Import works | import memory_profiler | No error |
| psutil installed | import psutil | Works |
| Run with -m | python -m memory_profiler script.py | Output shown |
| Decorator applied | @profile | Function wrapped |
| Permissions | ls /proc/self/maps | Readable |
Verify the Fix
```bash
# After fixing profiler issues

# 1. Run a simple test
python -c "
from memory_profiler import profile

@profile
def test():
    x = [0] * 10000
    return x

test()
"
# Shows memory usage output

# 2. Test with a script
python -m memory_profiler myscript.py
# Shows line-by-line memory

# 3. Check the memory_usage function
python -c "
from memory_profiler import memory_usage

def test():
    return [0] * 1000000

print(memory_usage((test, (), {})))
"
# Returns a list of memory readings

# 4. Verify psutil works
python -c "import psutil; print(psutil.Process().memory_info())"
# Shows memory info

# 5. Test the built-in alternative (tracemalloc)
python -c "
import tracemalloc
tracemalloc.start()
x = [0] * 1000000
print(tracemalloc.take_snapshot().statistics('lineno')[0])
"
# Shows the top memory allocation
```
Related Issues
- [Fix Python Memory Leak](/articles/fix-python-memory-leak)
- [Fix Python High Memory Usage](/articles/fix-python-high-memory-usage)
- [Fix Python GC Not Running](/articles/fix-python-gc-not-running)