Memory leaks are a silent threat that gradually degrades performance, leads to crashes, and increases operational costs. Unlike obvious bugs, memory leaks are often subtle and difficult to spot until they start causing serious problems.
Increased memory usage drives up server costs and negatively impacts user experience. Understanding how memory leaks occur is the first step in addressing them.
Understanding Memory Leaks
A memory leak happens when your application allocates memory and then fails to release it after it’s no longer needed. Over time, these unreleased memory blocks accumulate, leading to progressively higher memory consumption.
This is especially problematic in long-running processes like web servers, where the leak can cause the application to consume more and more memory until it eventually crashes or slows down to a crawl.
Understanding Memory Usage in Node.js (V8)
A Node.js process, powered by the V8 engine, tracks several distinct categories of memory, all reported by `process.memoryUsage()`. Each plays a critical role in how your application performs and utilizes resources.
| Memory Type | Description |
|---|---|
| RSS (Resident Set Size) | Total memory allocated for the Node.js process, including all parts of the memory: code, stack, and heap. |
| Heap Total | Memory allocated for JavaScript objects. This is the total size of the allocated heap. |
| Heap Used | Memory actually used by the JavaScript objects. This shows how much of the heap is currently in use. |
| External | Memory used by C++ objects that are linked to JavaScript objects. This memory is managed outside the V8 heap. |
| Array Buffers | Memory allocated for ArrayBuffer objects, which are used to hold raw binary data. |
- RSS (Resident Set Size): The total memory allocated for the process.
RSS refers to the total memory footprint of a Node.js process. It includes all memory allocated for the process, including the heap, stack, and code segments.
```javascript
console.log('Initial Memory Usage:', process.memoryUsage());

setInterval(() => {
  const memoryUsage = process.memoryUsage();
  console.log(`RSS: ${memoryUsage.rss}`);
}, 1000);
```
This script logs the RSS memory usage every second. We can observe how the total memory footprint changes over time.
```
➜ node rss.js
Initial Memory Usage: {
  rss: 38502400,
  heapTotal: 4702208,
  heapUsed: 2559000,
  external: 1089863,
  arrayBuffers: 10515
}
RSS: 41025536
RSS: 41041920
RSS: 41041920
RSS: 41041920
```
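The values reported by `process.memoryUsage()` are raw byte counts, which are hard to read at a glance. A small helper (illustrative, not part of the original scripts) can convert every field to megabytes:

```javascript
// Hypothetical helper: render process.memoryUsage() values in MB.
function formatMemoryUsage(memoryUsage) {
  const toMB = (bytes) => `${(bytes / 1024 / 1024).toFixed(2)} MB`;
  return Object.fromEntries(
    Object.entries(memoryUsage).map(([key, bytes]) => [key, toMB(bytes)])
  );
}

console.log(formatMemoryUsage(process.memoryUsage()));
```

The same helper works for any of the snippets below, since they all log fields of `process.memoryUsage()`.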
- Heap Total: The amount of memory allocated for the JavaScript objects.
Heap Total represents the total amount of memory allocated for the JavaScript objects by the V8 engine (the JavaScript engine used by Node.js).
```javascript
console.log('Initial Memory Usage:', process.memoryUsage());

const largeArray = new Array(1e6).fill('A');

setInterval(() => {
  const memoryUsage = process.memoryUsage();
  console.log(`Heap Total: ${memoryUsage.heapTotal}`);
}, 1000);
```
Allocating a large array increases the heap total. The logged heap total shows the memory allocated for JavaScript objects.
```
➜ node heap.js
Initial Memory Usage: {
  rss: 38535168,
  heapTotal: 4702208,
  heapUsed: 2559224,
  external: 1089863,
  arrayBuffers: 10515
}
Heap Total: 12976128
Heap Total: 12976128
Heap Total: 12976128
Heap Total: 12976128
Heap Total: 12976128
Heap Total: 12976128
Heap Total: 12976128
```
- Heap Used: The amount of memory actually used by the objects.
Heap Used refers to the amount of memory that is currently being used by the JavaScript objects on the heap.
When we push objects into an array, we’re increasing the amount of memory used by the heap.
```javascript
console.log('Initial Memory Usage:', process.memoryUsage());

let data = [];
for (let i = 0; i < 1e6; i++) {
  data.push({ index: i });
}

setInterval(() => {
  const memoryUsage = process.memoryUsage();
  console.log(`Heap Used: ${memoryUsage.heapUsed}`);
}, 1000);
```
The heap used value will rise as more objects are added.
```
➜ node heap-used.js
Initial Memory Usage: {
  rss: 38748160,
  heapTotal: 4702208,
  heapUsed: 2559424,
  external: 1089863,
  arrayBuffers: 10515
}
Heap Used: 2833808
Heap Used: 2847776
Heap Used: 2850800
Heap Used: 2854352
Heap Used: 2875800
Heap Used: 2879488
```
- External: Memory used by C++ objects bound to JavaScript.
External memory refers to the memory used by C++ objects linked to JavaScript. These objects are created through bindings that let JavaScript interact with native code, allocating memory outside of the typical JavaScript heap.
This memory isn’t directly visible in JavaScript but still adds to the total memory used by the application.
The Buffer.alloc method allocates a 50MB buffer, which is tracked as external memory.
```javascript
const buffer = Buffer.alloc(50 * 1024 * 1024); // Allocate a 50MB buffer

console.log('Initial Memory Usage:', process.memoryUsage());

setInterval(() => {
  const memoryUsage = process.memoryUsage();
  console.log(`External Memory: ${memoryUsage.external}`);
}, 1000);
```
This example logs the external memory usage, which will reflect the buffer allocation.
```
➜ node external.js
Initial Memory Usage: {
  rss: 39223296,
  heapTotal: 4702208,
  heapUsed: 2560832,
  external: 53518663,
  arrayBuffers: 52439315
}
External Memory: 53814435
External Memory: 53814435
External Memory: 53814435
External Memory: 53814435
External Memory: 53814435
External Memory: 53814435
External Memory: 53814435
```
- Array Buffers: Memory allocated for ArrayBuffer objects.
Array Buffers are memory used for ArrayBuffer objects. These objects store fixed-length binary data in JavaScript.
ArrayBuffer is part of JavaScript’s typed array system, letting you work with binary data directly.
The memory for these buffers is tracked separately from regular JavaScript objects. They’re often used for handling raw data, like files or network protocols.
Here’s an example where I allocate a 50MB ArrayBuffer and then check the initial memory usage of my Node.js process.
```javascript
const buffer = new ArrayBuffer(50 * 1024 * 1024); // 50MB ArrayBuffer

console.log('Initial Memory Usage:', process.memoryUsage());

setInterval(() => {
  const memoryUsage = process.memoryUsage();
  console.log(`Array Buffers: ${memoryUsage.arrayBuffers}`);
}, 1000);
```
```
➜ node array-buffer.js
Initial Memory Usage: {
  rss: 39075840,
  heapTotal: 4702208,
  heapUsed: 2559496,
  external: 53518663,
  arrayBuffers: 52439315
}
Array Buffers: 52439315
Array Buffers: 52439315
Array Buffers: 52439315
Array Buffers: 52439315
Array Buffers: 52439315
Array Buffers: 52439315
```
Common Causes of Memory Leaks in JavaScript
- Improperly Managed Variables
Variables that are not properly managed can cause memory leaks.
For instance, if you declare variables that are supposed to be temporary but forget to clean them up, they will continue to consume memory.
```javascript
let cache = {};

function storeData(key, value) {
  cache[key] = value;
}

// Simulating the function being called multiple times
storeData('item1', new Array(1000000).fill('A'));
storeData('item2', new Array(1000000).fill('B'));

// Memory leak: data stored in 'cache' is never released
```
In the example above, data is added to a global object called cache. If this data isn’t removed when it’s no longer needed, it will keep using memory unnecessarily.
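One way to keep such a cache from growing without bound is to evict entries explicitly. Here's a minimal sketch using a `Map` with a size cap; the constant `MAX_CACHE_ENTRIES` and the evict-oldest policy are illustrative choices, not part of the original example:

```javascript
const MAX_CACHE_ENTRIES = 100; // illustrative cap; tune for your workload

// A Map remembers insertion order, so its first key is the oldest entry.
const cache = new Map();

function storeData(key, value) {
  if (cache.size >= MAX_CACHE_ENTRIES) {
    const oldestKey = cache.keys().next().value;
    cache.delete(oldestKey); // evict the oldest entry to bound memory
  }
  cache.set(key, value);
}

function removeData(key) {
  cache.delete(key); // release an entry as soon as it is no longer needed
}
```

Deleting keys makes the stored arrays unreachable, so the garbage collector can reclaim them on its next cycle.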
This is especially problematic if these variables are stored in a global scope, making them persist throughout the application’s lifecycle.
```javascript
let globalUserSessions = {}; // Global scope

function addUserSession(sessionId, userData) {
  globalUserSessions[sessionId] = userData; // Store user data in global scope
}

function removeUserSession(sessionId) {
  delete globalUserSessions[sessionId]; // Manually remove user session
}

// Simulate adding user sessions
addUserSession('session1', { name: 'Alice', data: new Array(1000000).fill('A') });
addUserSession('session2', { name: 'Bob', data: new Array(1000000).fill('B') });

// The globalUserSessions object will persist for the entire app lifecycle unless manually cleaned up
```
globalUserSessions is a global object used to store user session data. Because it’s in the global scope, it persists for the entire runtime of the application.
If sessions are not properly removed using removeUserSession, the data will remain in memory indefinitely, leading to a memory leak.
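In practice you rarely know exactly when a session ends, so a common safeguard is to expire sessions after a fixed lifetime. The sketch below is illustrative; the 30-minute `SESSION_TTL_MS` and the sweep interval are assumed values, not from the original example:

```javascript
const SESSION_TTL_MS = 30 * 60 * 1000; // illustrative 30-minute session lifetime

const sessions = new Map();

function addUserSession(sessionId, userData) {
  sessions.set(sessionId, { userData, createdAt: Date.now() });
}

// Periodically sweep out expired sessions so the map cannot grow forever.
function evictExpiredSessions(now = Date.now()) {
  for (const [sessionId, session] of sessions) {
    if (now - session.createdAt > SESSION_TTL_MS) {
      sessions.delete(sessionId);
    }
  }
}

// In a real server, run the sweep on a timer; unref() lets the process
// exit even if the interval is still scheduled.
// setInterval(evictExpiredSessions, 60 * 1000).unref();
```

Even if a `removeUserSession` call is missed, the sweep guarantees stale session data is eventually released.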
- Persistent Global Objects
Global objects can hold onto memory longer than needed. Data in them can stay in memory after it’s no longer needed. This gradually increases memory usage.
```javascript
global.config = {
  settings: new Array(1000000).fill('Configuration')
};

// Memory leak: 'config' is global and remains in memory for the entire application lifecycle
```
Since config is globally accessible and never cleared, the memory it uses is retained for the entire runtime of the application. Here’s one way we can avoid the memory leak:
```javascript
function createConfig() {
  return {
    settings: new Array(1000000).fill('Configuration')
  };
}

// Use config only when needed, and let it be garbage collected afterwards
function processConfig() {
  const config = createConfig();
  // Perform operations with config
  console.log(config.settings[0]);
  // Config will be cleared from memory once it's no longer referenced
}

processConfig();
```
Instead of storing config in a global object, we store config locally within a function. This ensures that config is cleared after the function runs, freeing up memory for garbage collection.
- Event Listeners Not Removed Adding event listeners without removing them properly when they are no longer needed can lead to memory leaks.
Each event listener retains a reference to the function and any variables it uses, preventing the garbage collector from reclaiming that memory.
Over time, if you keep adding listeners without removing them, this will result in increased memory usage.
Here’s an example that demonstrates how event listeners can cause memory leaks if not properly removed:
```javascript
const EventEmitter = require('events');
const myEmitter = new EventEmitter();

function listener() {
  console.log('Event triggered!');
}

// Adding event listeners repeatedly
setInterval(() => {
  myEmitter.on('event', listener);
}, 1000);
```
A new event listener is added every second. However, these listeners are never removed, which causes them to accumulate in memory.
Each listener holds a reference to the listener function and any associated variables, preventing garbage collection and leading to increased memory usage over time.
To prevent this memory leak, you should remove event listeners when they are no longer needed.
```javascript
const EventEmitter = require('events');
const myEmitter = new EventEmitter();

function listener() {
  console.log('Event triggered!');
}

// Add an event listener
myEmitter.on('event', listener);

// Trigger the event and then remove the listener
myEmitter.emit('event');
myEmitter.removeListener('event', listener);

// Alternatively, use the `once` method to add a listener that
// automatically removes itself after being triggered
myEmitter.once('event', listener);
```
- Closures Capturing Variables
Closures in JavaScript can unintentionally hold onto variables longer than needed. When a closure captures a variable, it keeps a reference to that memory.
If the closure is used in a long-running process or isn’t properly terminated, the captured variables stay in memory, causing a leak.
```javascript
function createClosure() {
  let capturedVar = new Array(1000000).fill('Data');

  return function() {
    console.log(capturedVar[0]);
  };
}

const closure = createClosure();
// The closure holds onto 'capturedVar', even if it's not used anymore.
```
To avoid leaks, make sure closures don’t capture large variables they don’t need, or release the captured data once it is no longer required.
```javascript
function createClosure() {
  let capturedVar = new Array(1000000).fill('Data');

  return function() {
    console.log(capturedVar[0]);
    capturedVar = null; // Release memory; note this makes the closure single-use
  };
}

const closure = createClosure();
closure(); // 'capturedVar' is released after the first call.
```

Note that nulling the captured variable makes the closure single-use: a second call would throw, because `capturedVar` is no longer an array.
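An alternative sketch of the same idea: instead of nulling the variable after use, extract only the data the closure actually needs, so the large array is never captured in the first place and can be collected as soon as the factory returns.

```javascript
function createClosure() {
  const largeArray = new Array(1000000).fill('Data');
  const firstItem = largeArray[0]; // keep only what the closure uses

  // The returned function closes over a single string rather than the
  // million-element array, so the array becomes garbage immediately.
  return function () {
    return firstItem;
  };
}

const closure = createClosure();
console.log(closure()); // 'Data'
```

Unlike the nulling approach, this closure can be called any number of times.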
- Unmanaged Callbacks
In certain scenarios, unmanaged callbacks can cause memory issues if they hold onto variables or objects longer than necessary.
However, JavaScript’s garbage collector is generally effective at cleaning up memory once references are no longer needed.
```javascript
function fetchData(callback) {
  let data = new Array(1000000).fill('Data');

  setTimeout(() => {
    callback(data);
  }, 1000);
}

function handleData(data) {
  console.log(data[0]);
}

fetchData(handleData); // The 'data' array remains in memory.
```
In the example above:

- Data Allocation: The `fetchData` function allocates a large array (`data`) holding 1 million elements.
- Callback Reference: The callback function `handleData` references this large array when it’s invoked by `setTimeout` after 1 second.
Despite the large allocation, JavaScript’s garbage collector ensures that memory is released when no longer needed.
There is no need to manually clear the references unless you are dealing with very complex scenarios where references are unintentionally retained.
- Overly Complex Manual Cleanup (Not Recommended)
```javascript
function fetchData(callback) {
  let data = new Array(1000000).fill('Data');

  setTimeout(() => {
    callback(data);
    data = null;  // Release the reference
    global.gc();  // Explicitly trigger garbage collection
                  // (only available when node is run with --expose-gc)
  }, 1000);
}

function handleData(data) {
  console.log(data[0]);
  data = null; // Clear reference after handling
}

console.log('Initial Memory Usage:', process.memoryUsage());

fetchData(handleData);

setTimeout(() => {
  console.log('Final Memory Usage:', process.memoryUsage());
}, 2000); // Give some time for garbage collection
```
While this code manually clears references and explicitly triggers garbage collection, it introduces unnecessary complexity.
JavaScript’s garbage collector is typically sufficient for handling memory cleanup without these additional steps.
In most scenarios, such manual interventions are not only redundant but can also make the code harder to maintain.
- Incorrect Use of `bind()`

Using `bind()` creates a new function with its `this` keyword set to a specific value. If you’re not careful, this can cause memory leaks.
```javascript
function MyClass() {
  this.largeData = new Array(1000000).fill('leak');

  window.addEventListener('click', this.handleClick.bind(this));
}

MyClass.prototype.handleClick = function() {
  console.log('Clicked');
};

// If the MyClass instance is destroyed but the event listener is not removed,
// the bound function will keep the instance alive in memory.
```
- Why Memory Leaks Happen with `bind()`

1. References are Kept: When you use `bind()`, the new function remembers the original function and the `this` value. If you don’t remove the function when it’s no longer needed, it sticks around and uses memory.
2. Big Objects Stay in Memory: Bound functions can accidentally keep large objects in memory, even if you don’t need them anymore.
- Circular References
Circular references happen when two objects refer to each other (or an object refers to itself), creating a loop. Modern engines like V8 use mark-and-sweep garbage collection, which traces reachability rather than counting references, so a pure JavaScript cycle is collected once nothing outside it references its members. Cycles become a leak when any member of the loop is still reachable from a live root, such as a global variable, a long-lived cache, or an active event listener, because that one reference keeps every object in the loop alive.

```javascript
function CircularReference() {
  this.reference = this; // Circular reference
}

let obj = new CircularReference();
obj = null; // The cycle is collectable once it is unreachable
```

After `obj = null`, V8 can reclaim the object despite the self-loop. The danger is subtler: if `obj` (or anything in the cycle) is also stored somewhere long-lived, the entire structure stays in memory. Here’s how you can defensively avoid holding onto circular structures.
- Break the Loop: Make sure objects don’t refer back to each other when they are no longer needed. This helps the garbage collector clear them out.
```javascript
function CircularReference() {
  this.reference = this;
}

let obj = new CircularReference();

// Breaking the circular reference
obj.reference = null;

obj = null; // Now the memory can be freed
```
By setting `obj.reference` to `null`, we break the circular reference. This allows the garbage collector to free up the memory when `obj` is no longer needed.
- Use Weak References: Using `WeakMap`, `WeakSet`, or `WeakRef` allows the garbage collector to clean up memory even if there are references, as long as they are weak.
```javascript
let weakMap = new WeakMap();

function CircularReference() {
  let obj = {};
  weakMap.set(obj, 'This is a weak reference');
  return obj;
}

let obj = CircularReference();
// The object can be garbage collected when no longer needed
```
`weakMap` holds a weak reference to `obj`. This means that when `obj` is no longer used elsewhere, it can still be garbage collected even though it’s referenced in `weakMap`.
```javascript
let weakRef;

function createObject() {
  let obj = { data: 'important' };
  weakRef = new WeakRef(obj);
  return obj;
}

let obj = createObject();

console.log(weakRef.deref()); // { data: 'important' }

obj = null; // Now the object can be garbage collected
```
`weakRef` allows you to hold a weak reference to `obj`. If `obj` is set to `null` and there are no other references to it, it can be garbage collected, even though `weakRef` still exists.
Quick Note
`WeakMap`, `WeakSet`, and `WeakRef` are great for preventing memory leaks, but you might not need them all the time. They’re more for advanced use cases, like managing caches or big data.
If you’re working on typical web apps, you might not see them often, but it’s good to know they exist when you need them.
Profiling Memory Usage in Node.js
To find memory leaks, you need to profile your application to understand how memory is being used.
Here’s a Node.js application designed to simulate CPU-intensive tasks, I/O operations, and intentionally create a memory leak for testing purposes.
```javascript
const http = require('http');
const url = require('url');

// Simulate a CPU-intensive task
const handleCpuIntensiveTask = (req, res) => {
  let result = 0;
  for (let i = 0; i < 1e7; i++) {
    result += i * Math.random();
  }
  console.log('Memory Usage (CPU Task):', process.memoryUsage()); // Log memory usage
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end(`Result of the CPU-intensive task: ${result}`);
};

// Create a large in-memory buffer
const largeBuffer = Buffer.alloc(1024 * 1024 * 50, 'a'); // 50MB buffer filled with 'a'

// Simulate an I/O operation
const handleSimulateIo = (req, res) => {
  // Simulate reading the buffer as if it were a file
  setTimeout(() => {
    console.log('Memory Usage (Simulate I/O):', process.memoryUsage()); // Log memory usage
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end(`Simulated I/O operation completed with data of length: ${largeBuffer.length}`);
  }, 500); // Simulate a 500ms I/O operation
};

// Simulate a memory leak (For Testing)
let memoryLeakArray = [];

const causeMemoryLeak = () => {
  memoryLeakArray.push(new Array(1000).fill('memory leak'));
  console.log('Memory leak array length:', memoryLeakArray.length);
};

const server = http.createServer((req, res) => {
  const parsedUrl = url.parse(req.url, true);

  if (parsedUrl.pathname === '/cpu-intensive') {
    handleCpuIntensiveTask(req, res);
  } else if (parsedUrl.pathname === '/simulate-io') {
    handleSimulateIo(req, res);
  } else if (parsedUrl.pathname === '/cause-memory-leak') {
    causeMemoryLeak();
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Memory leak caused. Check memory usage.');
  } else {
    res.writeHead(404, { 'Content-Type': 'text/plain' });
    res.end('Not Found');
  }
});

const PORT = process.env.PORT || 3000;
server.listen(PORT, () => {
  console.log(`Server is running on port ${PORT}`);
});
```
Next, we stress test the server. The script below sends 100 requests to each endpoint: CPU-intensive, simulated I/O, and the memory leak.
```shell
#!/bin/bash

# Number of requests to send
REQUESTS=100

# Endpoint URLs
CPU_INTENSIVE_URL="http://localhost:3000/cpu-intensive"
SIMULATE_IO_URL="http://localhost:3000/simulate-io"
MEMORY_LEAK_URL="http://localhost:3000/cause-memory-leak"

echo "Sending $REQUESTS requests to $CPU_INTENSIVE_URL and $SIMULATE_IO_URL..."

# Loop for CPU-intensive endpoint
for ((i=1;i<=REQUESTS;i++)); do
  curl -s $CPU_INTENSIVE_URL > /dev/null &
done

# Loop for Simulated I/O endpoint
for ((i=1;i<=REQUESTS;i++)); do
  curl -s $SIMULATE_IO_URL > /dev/null &
done

# Loop for Memory Leak endpoint
for ((i=1;i<=REQUESTS;i++)); do
  curl -s $MEMORY_LEAK_URL > /dev/null &
done

wait
echo "Done."
```
It loops through the URLs and sends silent requests using curl, running them in the background to simulate a high load.
```
➜ ./load_test.sh
Sending 100 requests to http://localhost:3000/cpu-intensive and http://localhost:3000/simulate-io and http://localhost:3000/cause-memory-leak
Done.
```
Here’s how our server responds to the stress test. Make sure the server is running with V8’s profiler enabled (`node --prof server.js`) before you start the test.
```
➜ node --prof server.js
Server is running on port 3000
Memory Usage (Simulate I/O): {
  rss: 122863616,
  heapTotal: 17547264,
  heapUsed: 8668016,
  external: 54075004,
  arrayBuffers: 52439275
}
Memory leak array length: 25
Memory leak array length: 26
Memory leak array length: 27
Memory leak array length: 28
Memory leak array length: 29
Memory leak array length: 30
Memory leak array length: 31
Memory leak array length: 32
Memory leak array length: 33
Memory leak array length: 34
Memory leak array length: 35
Memory leak array length: 36
Memory leak array length: 37
Memory leak array length: 38
Memory leak array length: 39
Memory leak array length: 40
Memory leak array length: 41
Memory leak array length: 42
Memory leak array length: 43
Memory leak array length: 44
Memory leak array length: 45
Memory leak array length: 46
Memory leak array length: 47
Memory leak array length: 48
Memory leak array length: 49
Memory leak array length: 50
Memory leak array length: 51
Memory leak array length: 52
Memory leak array length: 53
Memory leak array length: 54
Memory leak array length: 55
Memory leak array length: 56
Memory Usage (CPU Task): {
  rss: 122716160,
  heapTotal: 17547264,
  heapUsed: 11393456,
  external: 54075004,
  arrayBuffers: 52439275
}
Memory leak array length: 173
```
Analysing the Results
The profiling data will be saved in a file with a name like isolate-0xXXXXXXXXXXXX-v8.log.
To process the log and get a human-readable summary, run:
```
➜ node --prof-process isolate-0x140008000-42065-v8.log > processed-profile.txt
```
This will generate a processed-profile.txt file with the CPU profiling data, which includes details about where your application spent time and how it managed memory.
Open the processed-profile.txt file and look for areas where a significant amount of time or memory is being used.
```
Statistical profiling result from isolate-0x140008000-42065-v8.log, (4099 ticks, 308 unaccounted, 0 excluded).

 [Shared libraries]:
   ticks  total  nonlib   name

 [JavaScript]:
   ticks  total  nonlib   name
    1007   24.6%   24.6%  JS: *handleCpuIntensiveTask /Users/trevorindreklasn/Projects/labs/node-memory/server.js:5:32
       5    0.1%    0.1%  JS: +handleCpuIntensiveTask /Users/trevorindreklasn/Projects/labs/node-memory/server.js:5:32
       1    0.0%    0.0%  JS: ^onParserExecute node:_http_server:839:25
       1    0.0%    0.0%  JS: ^getKeys node:internal/util/inspect:709:17
       1    0.0%    0.0%  JS: ^clearBuffer node:internal/streams/writable:742:21
       1    0.0%    0.0%  JS: ^checkListener node:events:276:23
       1    0.0%    0.0%  JS: ^Socket node:net:353:16
       1    0.0%    0.0%  JS: +pushAsyncContext node:internal/async_hooks:539:26
       1    0.0%    0.0%  JS: +processTicksAndRejections node:internal/process/task_queues:67:35

 [C++]:
   ticks  total  nonlib   name
    2772   67.6%   67.6%  t std::__1::__hash_table<...>::rehash(unsigned long)
                          (v8_inspector::InspectedContext map bookkeeping; template parameters abbreviated)

 [Summary]:
   ticks  total  nonlib   name
    1019   24.9%   24.9%  JavaScript
    2772   67.6%   67.6%  C++
     358    8.7%    8.7%  GC
       0    0.0%          Shared libraries
     308    7.5%          Unaccounted

 [C++ entry points]:
   ticks    cpp   total   name
    2636  100.0%   64.3%  TOTAL

 [Bottom up (heavy) profile]:
  Note: percentage shows a share of a particular caller in the total
  amount of its parent calls.
  Callers occupying less than 1.0% are not shown.

   ticks parent  name
    2772   67.6%  t std::__1::__hash_table<...>::rehash(unsigned long)
    1880   67.8%    JS: *handleCpuIntensiveTask /Users/trevorindreklasn/Projects/labs/node-memory/server.js:5:32
    1727   91.9%      JS: ^<anonymous> /Users/trevorindreklasn/Projects/labs/node-memory/server.js:36:34
    1129   65.4%        JS: +emit node:events:467:44
    1129  100.0%          JS: ^parserOnIncoming node:_http_server:1033:26
    1129  100.0%            JS: ^parserOnHeadersComplete node:_http_common:71:33
     598   34.6%        JS: ^emit node:events:467:44
     598  100.0%          JS: ^parserOnIncoming node:_http_server:1033:26
     598  100.0%            JS: ^parserOnHeadersComplete node:_http_common:71:33
     153    8.1%      JS: ~<anonymous> /Users/trevorindreklasn/Projects/labs/node-memory/server.js:36:34
     140   91.5%        JS: ^emit node:events:467:44
     140  100.0%          JS: ~parserOnIncoming node:_http_server:1033:26
     140  100.0%            JS: ~parserOnHeadersComplete node:_http_common:71:33
      13    8.5%        JS: ~parserOnIncoming node:_http_server:1033:26
      13  100.0%          JS: ~parserOnHeadersComplete node:_http_common:71:33
     655   23.6%    t std::__1::__hash_table<...>::rehash(unsigned long)
     654   99.8%      JS: *handleCpuIntensiveTask /Users/trevorindreklasn/Projects/labs/node-memory/server.js:5:32
     612   93.6%        JS: ^<anonymous> /Users/trevorindreklasn/Projects/labs/node-memory/server.js:36:34
     410   67.0%          JS: +emit node:events:467:44
     410  100.0%            JS: ^parserOnIncoming node:_http_server:1033:26
     202   33.0%          JS: ^emit node:events:467:44
     202  100.0%            JS: ^parserOnIncoming node:_http_server:1033:26
      42    6.4%        JS: ~<anonymous> /Users/trevorindreklasn/Projects/labs/node-memory/server.js:36:34
      40   95.2%          JS: ^emit node:events:467:44
      40  100.0%            JS: ~parserOnIncoming node:_http_server:1033:26
       2    4.8%          JS: ~parserOnIncoming node:_http_server:1033:26
       2  100.0%            JS: ~parserOnHeadersComplete node:_http_common:71:33
      49    1.8%    JS: ^<anonymous> /Users/trevorindreklasn/Projects/labs/node-memory/server.js:36:34
      38   77.6%      JS: +emit node:events:467:44
      38  100.0%        JS: ^parserOnIncoming node:_http_server:1033:26
      38  100.0%          JS: ^parserOnHeadersComplete node:_http_common:71:33
      11   22.4%      JS: ^emit node:events:467:44
      11  100.0%        JS: ^parserOnIncoming node:_http_server:1033:26
      11  100.0%          JS: ^parserOnHeadersComplete node:_http_common:71:33
```
1007 24.6% JS: *handleCpuIntensiveTask /Users/trevorindreklasn/Projects/labs/node-memory/server.js:5:32 940 93.3% JS: ^<anonymous> /Users/trevorindreklasn/Projects/labs/node-memory/server.js:36:34 663 70.5% JS: +emit node:events:467:44 663 100.0% JS: ^parserOnIncoming node:_http_server:1033:26 663 100.0% JS: ^parserOnHeadersComplete node:_http_common:71:33 277 29.5% JS: ^emit node:events:467:44 277 100.0% JS: ^parserOnIncoming node:_http_server:1033:26 277 100.0% JS: ^parserOnHeadersComplete node:_http_common:71:33 67 6.7% JS: ~<anonymous> /Users/trevorindreklasn/Projects/labs/node-memory/server.js:36:34 61 91.0% JS: ^emit node:events:467:44 61 100.0% JS: ~parserOnIncoming node:_http_server:1033:26 61 100.0% JS: ~parserOnHeadersComplete node:_http_common:71:33 6 9.0% JS: ~parserOnIncoming node:_http_server:1033:26 6 100.0% JS: ~parserOnHeadersComplete node:_http_common:71:33
308 7.5% UNKNOWN 11 3.6% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 11 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 2 18.2% JS: ~<anonymous> node:internal/streams/duplex:1:1 2 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 2 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 2 18.2% JS: ~<anonymous> node:http:1:1 2 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 2 100.0% JS: ~compileForPublicLoader node:internal/bootstrap/realm:332:25 1 9.1% JS: ~<anonymous> node:net:1:1 1 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 1 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 1 9.1% JS: ~<anonymous> node:internal/streams/readable:1:1 1 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 1 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 1 9.1% JS: ~<anonymous> node:internal/streams/operators:1:1 1 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 1 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 1 9.1% JS: ~<anonymous> node:internal/perf/observe:1:1 1 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 1 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 1 9.1% JS: ~<anonymous> node:internal/child_process:1:1 1 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 1 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 1 9.1% JS: ~<anonymous> node:child_process:1:1 1 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 1 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24 1 9.1% JS: ~<anonymous> node:_http_agent:1:1 1 100.0% JS: ^compileForInternalLoader node:internal/bootstrap/realm:384:27 1 100.0% JS: ^requireBuiltin node:internal/bootstrap/realm:421:24
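A tick report like the one above comes from V8's built-in sampling profiler. Assuming your entry point is `server.js` (as in the paths above), the workflow looks like this:

```shell
# Run the app with V8's sampling profiler enabled; while the process runs,
# it writes a log file named isolate-0x...-v8.log into the current directory.
node --prof server.js

# Send some traffic at the server, stop it, then turn the raw log into the
# human-readable bottom-up report shown above.
node --prof-process isolate-0x*-v8.log > profile.txt
```

The `--prof-process` step is what groups ticks by function and prints the percentages, so you can read `profile.txt` from the top down to find the hottest frames.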
Pay particular attention to:
- High-CPU functions: the entries that dominate the ticks are your bottlenecks; in the report above, handleCpuIntensiveTask in server.js accounts for most of the samples.
- Memory-intensive functions: functions that hold on to large amounts of memory can point to a leak, especially when they are supposed to release memory but never do.
- Event Loop and Garbage Collection (GC): Look for a high percentage of time spent in GC, as this might suggest that the application is struggling with memory management.
Memory leaks may be subtle, but addressing them is key to keeping your JavaScript applications efficient and reliable.