WebAssembly 101: Three Bets, One Bytecode
A map of WebAssembly's three distinct bets (browser performance, server-side WASI runtimes, edge compute), agnostic to whatever tech stack you run, so you can tell which one a given Wasm conversation is actually about.
WebAssembly conversations keep talking past each other because a single word covers three very different architectural bets. A frontend engineer hears "makes the web fast," a backend engineer hears "replaces containers," a platform engineer hears "powers the edge," and all three are partially right. This post separates the three so you can tell which one is on the table.
WASM in 30 Seconds
The thesis, before any axis is unpacked: one bytecode, three separate bets. Performance in the browser, a WASI runtime on the server, compute at the edge. Each bet answers a different question.
Source languages like Rust, C, C++, Go, AssemblyScript, and Zig compile down to a binary instruction format: the .wasm file. The bytecode is stack-based, memory-safe, sandboxed by default, and designed to run at near-native speed. It does not care what host it runs on. That host-portability is the whole point; everything else in this post is a story about which host you picked.
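To make "one bytecode, any host" concrete, here is a minimal sketch: the smallest useful .wasm module, hand-assembled into its byte form (equivalent to the .wat listing in the comment), instantiated by one possible host, Node's V8 engine. The bytes are illustrative, not from any particular toolchain.

```javascript
// Hand-assembled bytes equivalent to this .wat listing:
// (module (func (export "add") (param i32 i32) (result i32)
//   local.get 0 local.get 1 i32.add))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // magic "\0asm" + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b, // body
]);

// Any host with a Wasm engine can run these bytes; here the host is Node/V8.
const { exports } = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(exports.add(2, 3)); // → 5
```

The same forty-odd bytes would run unmodified in a browser tab, in Wasmtime, or on an edge runtime; only the instantiation code around them changes.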
The question the rest of the post answers: which of those three hosts are you actually interested in, and why?
Bet 1: In-Browser Performance
The browser bet answers a specific question: JavaScript is not fast enough for this CPU-bound job; can I ship an existing C++ or Rust codebase to the browser tab? The answer has been yes since 2017.
Three cases make this bet real. Figma shipped WebAssembly in 2017 and reported cutting the editor's load time by roughly 3×, compiling its C++ rendering engine to run inside the browser. Google Earth moved its 3D renderer and geospatial math to Wasm to bring the desktop experience to the web. Photoshop Web, built on a multi-year collaboration between Adobe and the Chrome team, compiles Photoshop's C++ core via Emscripten and launched a public web beta in 2021.
The limit worth naming: Wasm has no direct DOM access. Every time Wasm code reaches for a DOM node, an attribute, or an event, the call crosses a JavaScript-to-Wasm bridge. That bridge is not free, and it is the reason "just use Wasm instead of JavaScript" is the wrong mental model. Wasm is for the compute inside your application; JavaScript still owns the interface.
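The bridge is visible in the import mechanism itself. A sketch, again with hand-assembled illustrative bytes: a module that wants to reach the outside world can only do so through functions the JavaScript host hands it at instantiation time, which is exactly the crossing that DOM-heavy code pays for on every call.

```javascript
// Hand-assembled bytes equivalent to this .wat listing:
// (module (import "env" "log" (func (param i32)))
//   (func (export "run") i32.const 42 call 0))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,
  0x01, 0x08, 0x02, 0x60, 0x01, 0x7f, 0x00, 0x60, 0x00, 0x00, // two func types
  0x02, 0x0b, 0x01, 0x03, 0x65, 0x6e, 0x76, 0x03, 0x6c, 0x6f, 0x67, 0x00, 0x00, // import env.log
  0x03, 0x02, 0x01, 0x01,                                     // func 1 uses type 1
  0x07, 0x07, 0x01, 0x03, 0x72, 0x75, 0x6e, 0x00, 0x01,       // export "run"
  0x0a, 0x08, 0x01, 0x06, 0x00, 0x41, 0x2a, 0x10, 0x00, 0x0b, // run: log(42)
]);

const seen = [];
const { exports } = new WebAssembly.Instance(
  new WebAssembly.Module(bytes),
  { env: { log: (x) => seen.push(x) } } // JS supplies every host capability
);
exports.run();
console.log(seen); // → [42]
```

Every DOM read or write in a real application is a round trip through an import like `env.log` here, which is why the compute/interface split matters.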
This bet is the most mature of the three. All four major browsers ship it. Open design questions live around Emscripten, memory layout, and multi-threading via SharedArrayBuffer, not whether the platform works.
Bet 2: Server-Side Runtime (WASI)
The server bet answers a different question: can I run isolated, polyglot workloads without booting a full container, with a smaller sandbox and a colder-start profile than Docker offers? The answer is "yes, but the ecosystem is younger than containers."
Vocabulary to introduce once and move on. WASI (the WebAssembly System Interface) is the standardized surface that lets Wasm code talk to files, clocks, sockets, and environment variables without caring which OS is underneath. WASI has two active versions. Preview 1 is the older, POSIX-ish API still widely deployed. Preview 2 shipped on 25 January 2024, was rebuilt on top of the Component Model, and rethinks WASI as a set of typed interfaces rather than syscalls. Preview 3 (WASI 0.3) is already in release-candidate rollout on Wasmtime, with async support and native threads in scope; the 1.0 cut is expected late 2026 or early 2027.
Three platforms make the server bet real. Wasmtime is the reference server-side runtime, stewarded by the Bytecode Alliance and backed by the Cranelift code generator. Wasmer is an alternative runtime with its own packaging and distribution model. wasmCloud is a higher-level platform built on Wasm components and NATS messaging. For a deeper look at the component-portability thesis around wasmCloud and NATS, see wasmCloud + NATS: an event-bus portability bet.
The limit worth naming: the Component Model is still stabilizing. Preview 2 is the stable milestone; Preview 3 is already landing in Wasmtime as a release candidate. Solomon Hykes's 2019 remark that if Wasm plus WASI had existed in 2008, Docker would never have needed to be created is a good long-term framing, but it is a vision statement, not a 2026 drop-in replacement. Library coverage, language support, and tooling still vary by runtime. A team adopting server-side Wasm today is betting on a maturing ecosystem, not picking up a container equivalent.
Bet 3: Edge Compute
The edge bet answers a third question: my users are geographically distributed; I want per-request code to run close to them without paying container cold-start costs. Wasm fits naturally: its cold start is measured in milliseconds, not seconds, and its sandbox is cheap to spin up per request.
Three platforms anchor this bet. Cloudflare Workers pairs V8 isolates with Wasm modules; the official Rust path ships via the workers-rs crate. Fastly Compute is a pure Wasm-based edge runtime with no V8 wrapper in front of it, which gives it a different isolation story and performance profile. Shopify Functions is the customer-extensible commerce-logic platform where merchants ship Wasm modules (compiled from Rust, JavaScript, or AssemblyScript) that run inline in Shopify's checkout and discount paths.
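The per-request shape is easiest to see in the Workers model-worker style. A hedged sketch, not runnable on the real platform as-is: the `env` parameter is where platform-specific bindings (KV namespaces, queues, wasm assets) arrive, and the inline bytes stand in for a compiled .wasm asset that a real Worker would receive through a binding rather than compile at runtime.

```javascript
// Sketch of the Cloudflare-Workers-style module shape. The wasm call is
// portable across edge platforms; anything read from `env` is not.
const WASM_ADD = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,
  0x03, 0x02, 0x01, 0x00,
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00,
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,
]);

// On Workers this object would be the file's `export default`.
const worker = {
  async fetch(request, env) {
    const { instance } = await WebAssembly.instantiate(WASM_ADD, {});
    const sum = instance.exports.add(20, 22); // portable compute
    return new Response(`sum=${sum}`);        // host-shaped I/O
  },
};
```

Porting this handler to Fastly Compute or Shopify Functions means rewriting everything except the `add` call, which is the instruction-level versus platform-level portability gap in miniature.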
The limit worth naming: the bytecode is portable; the host bindings are not. Each edge platform exposes its own KV store, queue, binding ABI, and runtime shape. A Wasm module built for Cloudflare Workers is not a drag-and-drop deploy to Fastly Compute or Shopify Functions. The portability promise lives at the instruction level, not the platform level. Bindings are where the lock-in hides, which is what the related wasmCloud + NATS portability post explores in depth. For a hands-on look at what shipping a CPU-bound module to the V8-plus-Wasm edge actually involves — and where its three hard limits will bite — see Deploying a WASM Image-Resize Module to Cloudflare Workers.
Which Axis for Which Job
With the three bets named, the decision turns into a short tree. The root question is not "should I use Wasm?"; that framing sends you shopping. The root question is which of three distinct problems describes your current architecture.
The "skip" branch is the stance of this post. If none of the three questions describes your current shape, Wasm is not the answer to today's problem; introducing it now means taking on a platform bet you do not yet have a use case for. The three bets are real, but not universal.
Closing
Start with Bet 1 if the browser is already in your daily surface area and you have a CPU-bound pain the JavaScript engine cannot swallow; it is the most mature of the three and gives the clearest payback. If none of the three questions matches your current architecture, WebAssembly is not the answer to today's problem. Revisit once WASI Preview 3 stabilizes out of release-candidate status and the server-side ecosystem thickens.
References
- WebAssembly.org - The official home of the WebAssembly standard; the authoritative source for what the bytecode is and is not.
- MDN WebAssembly overview - Developer-facing concept index covering JavaScript-to-Wasm integration, memory, and tooling guides.
- WASI.dev - Official WASI portal; names the Preview 1 vs Preview 2 distinction and lists the active runtime ecosystem.
- Bytecode Alliance - Nonprofit stewarding Wasmtime, WASI, and the Component Model; its member list grounds the "who is behind server-side Wasm" question.
- WebAssembly/component-model (GitHub) - Current Component Model repository; the README documents which preview is the active milestone and what is scoped for the next one.
- WASI 0.2 launch — Bytecode Alliance - Confirms 25 January 2024 as the Preview 2 vote date and walks through the Component Model rationale.
- Figma blog — WebAssembly cut Figma's load time by 3× - Evan Wallace's 2017 write-up; the canonical in-browser Wasm case study.
- Photoshop's journey to the web (web.dev) - Adobe and Chrome team's multi-year case study on compiling Photoshop's C++ core via Emscripten.
- Cloudflare Workers — Rust language support - Official documentation for the V8-isolate-plus-Wasm edge model and the workers-rs crate path.
- Fastly Compute — product page - Fastly's first-party positioning of a pure-Wasm edge runtime, useful as a contrast to the V8-wrapped model.