I'm puzzled by the title of this post. From what I can gather most, if not all, of the performance improvements came from sacking SQLite and Zod.
They applied optimizations that cut CPU time by ~40% to the Bun version before comparing it with Node. Claiming 5x throughput from "replacing Node.js with Bun" is a wild misrepresentation of the findings.
Don’t let facts get in the way of a catchy headline
:love
And they include the "phase 3" optimizations in the phase 2 benchmark, so the move to Bun also includes the improvements from removing "safeParse". So Node might have been at more than 40% of the performance to begin with.
It's sad, since these kinds of numbers are interesting, but when there are blatant misrepresentations it just creates a stink.
The SQL query they replaced was extremely cringe and amateurish ("let's sprinkle DISTINCT until all those pesky redundant rows that come from our inefficient KV metadata schema go away"). The fact that they did not acknowledge that and somehow blamed it on SQLite made me stop reading on the spot, and left me very worried for whoever depends on their products.
I was curious why bun build --compile would be faster. The docs say:
“Compiled executables reduce memory usage and improve Bun’s start time.
Normally, Bun reads and transpiles JavaScript and TypeScript files on import and require. This is part of what makes so much of Bun “just work”, but it’s not free. It costs time and memory to read files from disk, resolve file paths, parse, transpile, and print source code.
With compiled executables, you can move that cost from runtime to build-time.”
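The docs' claim is easy to try out. A minimal invocation might look like this (assuming a recent Bun install and a hypothetical entry point at src/index.ts):

```shell
# Bundle, transpile, and embed the Bun runtime into a single binary.
bun build ./src/index.ts --compile --outfile myapp

# The result runs with no node_modules or source files present.
./myapp
```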
https://bun.com/docs/bundler/executables#deploying-to-produc...
It's not about Bun, but more about the SQLite and Zod replacements. Why interpret this as "Bun is faster"?
Gets more reactions that way
>Next: the runtime itself. Bun has a bun build --compile flag that produces a single self-contained executable. No runtime, no node_modules, no source files needed in the container.
I didn't know that. So Bun is basically a whole runtime + framework all in one with little to no deployment headaches?
The bun build creates a large self-contained executable with no optimisations. Almost like a large electron build.
Deno also provides the same functionality, but with a smaller optimized binary.
Appreciate Bun helping create healthy competition. I feel like Deno often flies under most people's radar. More security options, faster than Node, built on web standards.
Deno's security options are very useful for AI sandboxes. Broader than node's permissions. Bun badly needs the same.
There's a PR for Bun that gives the same security but it's been sitting for months https://github.com/oven-sh/bun/pull/25911
I want to migrate an existing project to Bun but cannot until it has a security permission system in place.
I was curious:
Maybe I'm missing some flags? Bun's docs say --compile implies --production. I don't see anything in Deno's docs.
Where? bun's doc site search engine doesn't show it but there's an open PR on the topic.
https://github.com/oven-sh/bun/issues/26373
Doc site says: --production sets flag --minify, process.env.NODE_ENV = production, and production-mode JSX import & transform
Might try:
D'oh, it wasn't the doc site. I was lazy:
I actually did double check it though because it used to be wrong. For good measure:
Ideally we would still only use JavaScript in the browser. Personally I don't care about the healthy competition, but rather that npm actually works when I am stuck writing server-side code I didn't ask for.
FE-BE standardization is efficient in terms of labor and code-migration portability, but I really like the idea of static compilation and optimization of the BE in production. There's absolutely no need or reason for the BE to do anything dynamic in prod, as long as it retains profiling inspectability when things go wrong.
That doesn’t align with my experience. It feels more like a trojan horse. Client and Server rarely (should) share code, and people that are really good at one discipline aren’t that good at the other. Maybe LLMs will change that.
Except we have moved beyond that with SaaS products, agents, and AI workflows.
The only reason I touch JavaScript on the backend instead of .NET, Java, Go, Rust, OCaml, Haskell, ... is SDKs, and customers that don't allow any option other than JavaScript all over the place.
Thus my point of view is that I couldn't care less about competition between JavaScript runtimes.
This (single executable) is available in node.js now too as SEA mode.
But I think it still doesn't work with ESM, only CommonJS, so while not insurmountable, it's not as good as Bun.
SEA with node.js "works" for nearly arbitrarily general node code -- pretty much anything you can run with node. However you may have to put in substantial extra effort, e.g., using [1], and possibly more work (e.g., copying assets out or using a virtual file system).
[1] https://www.npmjs.com/package/@vercel/ncc
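For comparison, Node's SEA flow is considerably more manual. A sketch of the documented steps, assuming a hello.js entry point already bundled to CommonJS (e.g. with ncc):

```shell
# 1. Describe the blob to embed via sea-config.json:
#    { "main": "hello.js", "output": "sea-prep.blob" }
node --experimental-sea-config sea-config.json

# 2. Copy the node binary itself, then inject the blob into it.
cp "$(command -v node)" hello
npx postject hello NODE_SEA_BLOB sea-prep.blob \
  --sentinel-fuse NODE_SEA_FUSE_fce680ab2cc467b6e072b8b5df1996b2

./hello
```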
How much would you get by moving to Go, Rust or C++?
I wish you were getting replies instead of downvotes. I want to know why people think Bun is preferable here. For cross-platform non-performance-important code, I’ll use Bun all day. Once speed enters the equation, I don’t see why you’d still be using it.
A lot, but apparently we cannot get rid of having server side JavaScript code.
I use Bun for everything except monorepos with isolated deployment targets and shared packages; for those I use yarn or pnpm. Maybe it's changed in the last six months, but I could never get Docker to properly resolve my dependencies when I only want to build the web app, for example: the Bun lockfile is deterministic based on all the packages in the repo, so isolating a single leaf makes it error.
Maybe I'm doing something wrong but I scoured docs and online and asked multiple AI agents to no avail.
I cringed at those "aaa\0bbb\0ccc\0ddd" Map keys. That's much slower than nested maps and requires allocating the strings, giving GC more work to do.
Creating a custom tuple class to use as key could be faster though. Nested map lookups have less efficient memory access patterns.
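The trade-off being described can be sketched like this (hypothetical keys and values, not the article's actual code): the flat variant builds a fresh key string on every lookup, while the nested variant chains gets without allocating.

```javascript
// Composite string key: every set AND every get allocates a key string.
const flat = new Map();
flat.set(["aaa", "bbb", "ccc"].join("\0"), 42);

// Nested maps: no per-lookup allocation, at the cost of chained lookups
// with less cache-friendly memory access.
const nested = new Map([["aaa", new Map([["bbb", new Map([["ccc", 42]])]])]]);

function getFlat(a, b, c) {
  return flat.get(a + "\0" + b + "\0" + c); // allocates the key string
}

function getNested(a, b, c) {
  return nested.get(a)?.get(b)?.get(c); // no allocation, three map hops
}

console.log(getFlat("aaa", "bbb", "ccc"));   // 42
console.log(getNested("aaa", "bbb", "ccc")); // 42
```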
So is Bun saying that JSC is much better than v8?
It's more that Zig is faster than JS. The speed advantages of Bun come from all the Zig bindings, not the JS interpreter.
C and C++ as well, and nodejs has bindings.
tl;dr: replacing SQLite with a Map gave a ~2x speedup, and replacing Zod validation with plain ifs gave another ~2x. Bun had a memory leak on unresolved promises, now fixed.
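The "Zod to ifs" swap amounts to something like the following sketch (the event shape and field names here are hypothetical, not from the article): plain property checks skip building a schema parse-result object on every request.

```javascript
// Hand-rolled validation in place of a schema library's safeParse:
// a few typeof checks, no intermediate result objects.
function validateEvent(e) {
  if (typeof e !== "object" || e === null) return false;
  if (typeof e.id !== "string" || e.id.length === 0) return false;
  if (typeof e.ts !== "number" || !Number.isFinite(e.ts)) return false;
  if (e.tags !== undefined && !Array.isArray(e.tags)) return false;
  return true;
}

console.log(validateEvent({ id: "a1", ts: 1700000000 })); // true
console.log(validateEvent({ id: "", ts: NaN }));          // false
```

The usual cost of this approach is losing the schema library's error messages and type inference, which is why it only tends to pay off on hot paths.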