Speed isn’t the only factor that makes a JavaScript runtime great, but it’s a good place to start.
A runtime is essentially the environment that executes your code, handling things like memory management, I/O operations, and more.
It's the machinery under the hood of your JavaScript applications.
Runtime | Engine | Engine Language (Built In) | Runtime Language (Built In) | First Release |
---|---|---|---|---|
Node.js | V8 | C++ | C++ | 2009 |
Deno | V8 | C++ | Rust | 2020 |
Bun | JavaScriptCore | C++ | Zig | 2022 |
The terms “engine” and “runtime” refer to different components that work together to execute JavaScript code.
Engine
The engine is the core component that interprets or compiles JavaScript code into machine code that the computer can execute. It is responsible for understanding and executing the JavaScript syntax and features.
Engines are designed to optimize code execution and manage memory efficiently.
- V8: An open-source JavaScript engine developed by Google, written in C++. It’s used in both Node.js and Deno, as well as in Google Chrome.
- JavaScriptCore: Also known as “JSC,” this is the engine developed by Apple, written in C++, and used in Bun and Safari.
Runtime
The runtime is the environment that provides additional functionality beyond what the engine offers. That includes APIs and libraries that enable interaction with the operating system, manage I/O operations, handle asynchronous tasks, and more.
The runtime essentially wraps around the engine and extends its capabilities.
- Node.js: Built in C++ and JavaScript, it provides a rich set of APIs for server-side development, such as file system access, HTTP servers, and more.
- Deno: Developed in Rust, Deno offers secure-by-default features (a small example follows this list), native TypeScript support, and a standard library.
- Bun: Written in Zig, Bun focuses on performance and includes a fast JavaScript/TypeScript runtime and bundler.
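To illustrate what secure-by-default means in practice, here is a minimal sketch. The file name and path are hypothetical; the point is that Deno blocks file system (and network) access unless you grant it explicitly.

```ts
// read_config.ts -- hypothetical example file
// Deno runs TypeScript directly, no build step needed.
// Without the --allow-read flag, this file system call is denied (or prompts for permission).
const text = await Deno.readTextFile("./config.json"); // path is only an example
console.log(text);
```

Running `deno run read_config.ts` stops at the read, while `deno run --allow-read read_config.ts` permits it.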
How They Work Together
Engine + Runtime: The engine parses and executes the JavaScript code, while the runtime provides the necessary environment and tools for the code to interact with the system and perform useful tasks.
For example, in Node.js, the V8 engine handles the JavaScript execution, while the Node.js runtime provides modules like fs (for file system operations) and http (for building servers).
Similarly, in Deno, the V8 engine runs the JavaScript, and the Deno runtime offers features like permissions and a standard library for common tasks.
Bun uses JavaScriptCore for execution and its runtime for high-performance utilities.
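To make the split concrete, here is a minimal "hello" server sketched for each runtime. The request handling itself is plain JavaScript executed by the engine; the server API (`http` in Node.js, `Deno.serve` in Deno, `Bun.serve` in Bun) is what the runtime adds on top. The port and response text are arbitrary choices for illustration.

```js
// server.mjs (Node.js) -- run with: node server.mjs
import { createServer } from "node:http";

createServer((req, res) => {
  // This callback is ordinary JavaScript, executed by V8.
  res.end("Hello from Node.js");
}).listen(3000);
```

```ts
// server.ts (Deno) -- run with: deno run --allow-net server.ts
Deno.serve({ port: 3000 }, () => new Response("Hello from Deno"));
```

```js
// server.js (Bun) -- run with: bun server.js
Bun.serve({
  port: 3000,
  fetch() {
    return new Response("Hello from Bun");
  },
});
```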
In a nutshell, the engine is the core JavaScript interpreter or compiler. The runtime is the broader environment that includes tools, libraries, and APIs for building and running applications.
Both are essential for the complete execution of JavaScript code in different contexts.
Importance of Speed in JavaScript Runtimes
- User Experience: Faster response times result in a better user experience. Applications that run quickly are seen as more responsive and reliable. This can increase user satisfaction and retention.
- Scalability: Higher speed often correlates with lower resource usage. This efficiency enables servers to handle more concurrent users or requests, enhancing scalability and cost-effectiveness, particularly for high-traffic applications.
- Developer Productivity: Faster runtimes shorten development time by speeding up testing and debugging phases. Quicker feedback loops enhance the developer experience and accelerate the development process.
- Cost Efficiency: Efficient resource utilization lowers operational costs. By reducing CPU and memory usage, you can achieve significant savings, especially as your application scales.
Benchmarking Runtimes Using AutoCannon
AutoCannon is an HTTP/1.1 benchmarking tool that measures server performance by focusing on key metrics like raw throughput and latency.
I simulated 100 concurrent connections for 30 seconds against each runtime's HTTP server and measured how well it handled requests and data under heavy load.
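The exact command isn't reproduced above; a run matching those parameters looks roughly like the sketch below, using AutoCannon's programmatic API. The target URL and port are assumptions (they just need to point at whichever server is being tested).

```js
// bench.mjs -- a sketch of the AutoCannon run described above (not the author's exact script)
import autocannon from "autocannon"; // npm install autocannon

const result = await autocannon({
  url: "http://localhost:3000", // assumed target
  connections: 100,             // 100 concurrent connections
  duration: 30,                 // 30-second run
});

// Average requests per second, average latency (ms), and average throughput (bytes/sec)
console.log(result.requests.average, result.latency.average, result.throughput.average);
```

The equivalent CLI call is `npx autocannon -c 100 -d 30 http://localhost:3000`.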
Hardware Overview
Key Metrics
- Requests per Second (Req/Sec): Indicates the number of HTTP requests the server can handle per second. A higher value means better handling of concurrent traffic.
- Average Latency: Measures the time taken for a request to travel to the server and back. Lower latency means faster response times.
- Throughput: Represents the amount of data transferred between the server and clients per second. Higher throughput indicates the server’s efficiency in handling data.
Understanding the Metrics (example)
The table below provides a snapshot of the key performance metrics observed during our benchmarking tests. Each metric helps illustrate how well the servers perform under load, offering insights into their responsiveness and capacity.
Metric | 2.5% | 50% | 97.5% | 99% | Avg | Stdev | Max |
---|---|---|---|---|---|---|---|
Latency (ms) | 0 | 0 | 1 | 1 | 0.1 | 0.34 | 19 |
Req/Sec | 100,479 | 100,479 | 106,367 | 108,607 | 106,182.4 | 1,948.84 | 100,454 |
Bytes/Sec (MB) | 18.5 | 18.5 | 19.6 | 20 | 19.5 | 0.36 | 18.5 |
Breakdown of Metrics
- Percentiles (2.5%, 50%, 97.5%, 99%): These values show the latency experienced at different points in the distribution of requests. For example, the 2.5% value means 2.5% of the requests had a latency equal to or lower than that number, highlighting the best response times. The 50% (median) indicates the typical latency, while the 97.5% and 99% values show the upper range of latencies, reflecting slower responses under load.
- Avg (Average): The mean of all data points, such as latency or request rate. It provides a general sense of the runtime's performance, showing how quickly the server responds on average.
- Stdev (Standard Deviation): Measures the variability of the data points. A low standard deviation suggests consistent performance, while a high standard deviation indicates more variability and potential instability.
- Max: The maximum observed value, representing the worst-case scenario. It shows the highest latency or the largest deviation from the average, which is crucial for understanding the extremes of system performance.
These metrics provide a clear picture of each server’s efficiency in handling concurrent requests and data. They allow us to assess the stability, speed, and overall reliability of the runtimes.
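If you want to see how these statistics fall out of raw measurements, here is a small self-contained sketch. The latency samples are invented; the calculations mirror what a benchmarking tool reports (percentile definitions vary slightly between tools).

```js
// stats.mjs -- illustrative only; the sample latencies below are made up
const samples = [0, 0, 0, 1, 1, 1, 2, 3, 5, 19]; // latencies in ms

const avg = samples.reduce((sum, x) => sum + x, 0) / samples.length;
const stdev = Math.sqrt(
  samples.reduce((sum, x) => sum + (x - avg) ** 2, 0) / samples.length
);

// One common percentile definition: the smallest sample such that at least
// p% of all samples are less than or equal to it.
const percentile = (p) => {
  const sorted = [...samples].sort((a, b) => a - b);
  const index = Math.max(0, Math.ceil((p / 100) * sorted.length) - 1);
  return sorted[index];
};

console.log({
  avg,
  stdev,
  p50: percentile(50),
  p99: percentile(99),
  max: Math.max(...samples),
});
```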
Node.js (v22.5.1)
Metric | 2.5% | 50% | 97.5% | 99% | Avg | Stdev | Max |
---|---|---|---|---|---|---|---|
Latency (ms) | 0 | 0 | 1 | 1 | 0.1 | 0.34 | 19 |
Req/Sec | 100,479 | 100,479 | 106,367 | 108,607 | 106,182.4 | 1,948.84 | 100,454 |
Bytes/Sec (MB) | 18.5 | 18.5 | 19.6 | 20 | 19.5 | 0.36 | 18.5 |
Node.js handled an average of 106,182.4 requests per second with an average latency of 0.1 ms. The maximum latency reached 19 ms, showing some variability under load, though responses stayed generally fast. At roughly 19.5 MB/s, total data transferred over the 30-second run came to about 586 MB.
Bun (v1.1.21)
Metric | 2.5% | 50% | 97.5% | 99% | Avg | Stdev | Max |
---|---|---|---|---|---|---|---|
Latency (ms) | 0 | 0 | 0 | 1 | 0.02 | 0.16 | 16 |
Req/Sec | 128,639 | 128,639 | 132,095 | 135,423 | 132,417.07 | 1,567.99 | 128,591 |
Bytes/Sec (MB) | 16.5 | 16.5 | 16.9 | 17.3 | 16.9 | 0.20 | 16.5 |
Bun averaged 132,417.07 requests per second with an impressive average latency of just 0.02 ms. The maximum latency observed was 16 ms. At about 16.9 MB/s, Bun transferred roughly 508 MB in total, slightly lower than Node.js.
Deno (v1.45.5)
Metric | 2.5% | 50% | 97.5% | 99% | Avg | Stdev | Max |
---|---|---|---|---|---|---|---|
Latency (ms) | 0 | 0 | 1 | 1 | 0.04 | 0.23 | 20 |
Req/Sec | 141,439 | 141,439 | 148,991 | 149,887 | 148,309.34 | 1,748.84 | 141,387 |
Bytes/Sec (MB) | 21.8 | 21.8 | 23 | 23.1 | 22.8 | 0.27 | 21.8 |
Deno led the pack with an average of 148,309.34 requests per second and an average latency of 0.04 ms. The maximum latency reached 20 ms. At about 22.8 MB/s, Deno also achieved the highest total throughput of the three, roughly 685 MB.
Results
Latency: Lower latency means faster response times. Bun and Deno had lower average latencies than Node.js, making them more responsive under the same load.
Requests per Second: This metric indicates how many requests the server can handle per second. Deno outperformed both Node.js and Bun, making it the most capable at handling large numbers of concurrent requests.
Throughput: Throughput measures the amount of data processed per second. Deno also excelled here, handling the largest amount of data.
Conclusion
Deno stood out as the fastest among the three, followed by Bun, with Node.js coming in third.
Benchmarking Runtimes Using Artillery
Artillery is designed for sophisticated load testing, including testing APIs, microservices, and websites. It’s used for simulating complex, real-world scenarios with multiple stages and user interactions.
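The scenario definition used for these tests isn't reproduced here. As a rough sketch, an Artillery configuration consistent with the numbers reported below (3,000 virtual users, 9,000 successful requests, around 150 requests per second) might look like this; the arrival rate, duration, and target URL are assumptions, not the author's exact test file.

```yaml
# load-test.yml -- illustrative only, not the original test definition
config:
  target: "http://localhost:3000"   # assumed target
  phases:
    - duration: 60                  # assumed: 60 s at 50 new VUs per second = 3,000 VUs
      arrivalRate: 50
scenarios:
  - flow:
      - get:
          url: "/"
      - get:
          url: "/"
      - get:
          url: "/"                  # three requests per VU = 9,000 requests in total
```

A file like this would be run with `npx artillery run load-test.yml` against each runtime's server in turn.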
Node.js
Deno
Bun
Breakdown of Metrics
- HTTP Codes: The number of successful HTTP responses (200 OK) received.
- Downloaded Bytes: The total amount of data downloaded during the test.
- Request Rate: The number of HTTP requests sent per second.
- Response Time: The time taken for a server to respond to a request, including metrics like min, max, mean, median, p95 (95th percentile), and p99 (99th percentile).
- Virtual Users (VUs): The number of simulated users interacting with the server, including those completed, created, failed, and their session lengths.
Results
In the Artillery benchmark tests, we evaluated Node.js, Deno, and Bun by simulating real-world scenarios with a significant number of virtual users (VUs).
Node.js:
- HTTP Codes 200: Successfully handled 9000 requests.
- Request Rate: Maintained an average of 140 requests per second.
- Response Time: The mean response time was low at 0.1 ms, with a maximum of 15 ms.
- Virtual Users Completed: 3000 VUs completed their sessions, with session lengths ranging from 1.1 to 17.6 seconds.
Deno:
- HTTP Codes 200: Handled 9000 successful responses.
- Request Rate: Averaged around 150 requests per second.
- Response Time: Consistently low response times, with a mean of 0.1 ms and a maximum of 9 ms.
- Virtual Users Completed: 3000 VUs completed their sessions, with session lengths between 1.1 and 20.4 seconds.
Bun:
- HTTP Codes 200: Also handled 9000 requests successfully.
- Request Rate: Similar to Deno, with 150 requests per second.
- Response Time: Very low mean response time of 0.1 ms, with a maximum of 13 ms.
- Virtual Users Completed: 3000 VUs completed their sessions, with session lengths from 1 to 18.4 seconds.
Overall, all three runtimes handled high concurrent traffic excellently with minimal response times.
Deno and Bun had slightly higher request rates than Node.js.
All three maintained low response times, showing efficient performance under load.
The consistent handling of 3000 virtual users across all tests demonstrates the stability and reliability of each runtime in real-world scenarios.
Final Conclusion
Based on the benchmarks, Deno appears to be the fastest runtime. It consistently demonstrated the highest request rates, lowest response times, and handled the largest amount of data throughput.
While Deno excels in raw performance, the choice of runtime should also consider factors like ecosystem maturity, community support, and ease of use.
That said, all three runtimes are evolving quickly, with frequent updates and optimizations, and the performance differences measured here are slight, so speed alone rarely settles the decision.
Each runtime has its strengths, and the best choice depends on your specific project requirements and context.