React2Shell and the Absence of Web Standards for Partial Rendering

On December 3, 2025, a security vulnerability shook the React ecosystem. CVE-2025-55182, dubbed "React2Shell," received a perfect CVSS score of 10.0. This wasn't just another bug—it revealed a fundamental structural problem facing modern web development.

This article explores why React2Shell happened and how it relates to the absence of web standards for partial rendering.


What Happened

CVE-2025-55182 (CVSS 10.0)

The first vulnerability, disclosed on December 3rd, was a Remote Code Execution (RCE) flaw in the Flight protocol used by React Server Components (RSC).

The core issue was insecure deserialization. A single unauthenticated HTTP request could completely compromise a server. Even a basic Next.js app created with create-next-app was vulnerable.

Within hours of disclosure, multiple threat actors—including China-nexus groups—actively exploited it, with attacks confirmed in production environments.

CVE-2025-55184 & CVE-2025-55183 (CVSS 7.5 & 5.3)

Days later, on December 11th, security researchers analyzing the initial patch discovered two additional vulnerabilities:

  • CVE-2025-55184: Denial of Service (DoS). Infinite Promise recursion could freeze Node.js servers.
  • CVE-2025-55183: Source code exposure. Calling .toString() on Server Functions could leak hardcoded API keys and secrets (see the sketch below).
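To see why the source-exposure issue matters, here is a minimal plain-JavaScript sketch (the chargeCard function and the payments.example.com endpoint are hypothetical, and this is not the actual React code path): Function.prototype.toString() returns a function's full source text, so any secret written literally inside the body travels with it.

// Hypothetical server function with a secret hardcoded in its body
async function chargeCard(amount) {
  const apiKey = 'sk_live_hypothetical_key'; // literal inside the function body
  return fetch('https://payments.example.com/charge', {
    method: 'POST',
    headers: { Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({ amount }),
  });
}

// toString() reproduces the source text above, secret included
console.log(chargeCard.toString());

If a framework ever hands such a function (or its string form) back to the client, the secret goes with it; keeping credentials in environment variables or a secrets manager avoids the problem entirely.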

The initial patch was incomplete; the residual issue was tracked as CVE-2025-67779 and required yet another patch.


Why Did This Happen?

A thought occurred to me:

"React started as a browser UI library. Did something go wrong when it extended to the server?"

A Custom Protocol Ignoring HTTP Standards

React Server Components introduced a custom wire protocol called Flight. Rather than exchanging standard HTTP/REST payloads, server and client communicate using React's proprietary serialization format.

Traditional HTTP works like this:

Client → [HTTP Request] → Server
Server → [JSON/HTML] → Client

✓ Clear boundaries
✓ Proven formats
✓ Established security best practices

But the Flight Protocol:

Client → [React Serialized Payload] → Server
Server → [React Serialized Payload] → Client

✗ Custom serialization format
✗ Transmits JavaScript objects directly
✗ Serializes function references and closures
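The practical difference shows up when the payload arrives. The sketch below is a simplified, hypothetical illustration in plain JavaScript (the "$fn:" convention and the updateUser/deleteUser registry are invented for this example; they are not the real Flight format): parsing plain JSON yields inert data that still gets an explicit check, while a deserializer that revives references by name lets the wire format reach into server-side code.

// --- Plain JSON: the parsed result is inert data; nothing runs while parsing ---
const safeBody = '{"text":"Hello"}';
const data = JSON.parse(safeBody);
if (typeof data.text !== 'string') throw new Error('invalid input'); // explicit boundary check

// --- A richer deserializer (hypothetical): reviving references by name ---
async function updateUser(payload) { /* update a record */ }
async function deleteUser(payload) { /* delete a record */ }
const registry = { updateUser, deleteUser };

const richBody = '{"action":"$fn:deleteUser","payload":{"id":42}}';
const message = JSON.parse(richBody, (key, value) =>
  typeof value === 'string' && value.startsWith('$fn:')
    ? registry[value.slice(4)]  // a string in the payload becomes a live function
    : value
);
await message.action(message.payload); // the client decides which server function runs

The second pattern is convenient, but every name the registry exposes becomes remotely reachable, which is exactly the kind of surface an insecure deserializer opens up.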

The "Magic" That Blurred Boundaries

The React team aimed to maximize Developer Experience (DX):

'use server'
async function updateUser(formData) {
  await db.update(formData);
}

// Call from client "as if it were a local function"
<form action={updateUser}>

Making the server-client boundary transparent seemed elegant. But there was a trap.

A fundamental security principle states: "Define clear boundaries and thoroughly validate untrusted input."

React intentionally obscured this boundary, and as a result, both developers and React itself neglected validation at the boundary.
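Here is what validating at that boundary can look like in practice, as a minimal sketch: the zod schema, the field names, and the db.users.update call are illustrative assumptions layered onto the earlier snippet, not the vulnerable code or a prescribed fix.

'use server';

import { z } from 'zod';

// Hypothetical schema: describe exactly what the client is allowed to send
const UpdateUserInput = z.object({
  id: z.string().uuid(),
  name: z.string().min(1).max(100),
});

export async function updateUser(formData) {
  // Treat the incoming FormData as untrusted; validate before anything privileged runs
  const input = UpdateUserInput.parse({
    id: formData.get('id'),
    name: formData.get('name'),
  });
  await db.users.update(input); // `db` stands in for your data layer, as in the snippet above
}

The point is not the specific library: it is that the request crosses a trust boundary, and the code acknowledges that explicitly instead of relying on the framework's "magic" to keep the two sides apart.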

History Repeats Itself

This pattern isn't new:

  1. Java RMI: "Call remote objects like local ones" → Countless deserialization vulnerabilities → Largely abandoned by enterprises
  2. PHP's unserialize(): Transmitting serialized objects over HTTP → RCE vulnerabilities → Strongly discouraged for untrusted input
  3. SOAP/XML-RPC: Complex type systems and serialization → Security issues → Largely replaced by REST/JSON

In the trade-off between "developer experience" and "security," high-level abstractions provide convenience but simultaneously blur security boundaries.


The Root Cause: HTTP Is Too Old

But we need to think one step further.

Isn't HTTP itself too outdated to keep pace with the complexity of modern web applications?

The Limitations of HTTP

HTTP is a protocol created in 1991. Its essence:

  • Resource-centric (URL = Resource identifier)
  • Request/Response model
  • Stateless

But modern web apps require:

  • Component-centric (URL ≠ Component)
  • Streaming/Incremental updates
  • Stateful interactions

HTTP is a "document transfer protocol," not an "application protocol."

The Reality of Partial Rendering

Consider when a user adds a single comment:

// Method 1: Full page reload
window.location.reload(); // ← 2000s approach

// Method 2: Fetch JSON and render on the client
const res = await fetch('/api/comments', { method: 'POST' });
const comment = await res.json(); // parse the JSON body
setComments([...comments, comment]); // ← React's way

// Method 3: Fetch an HTML fragment and insert it
const html = await (await fetch('/api/comments/latest')).text(); // read the body as text
container.innerHTML = html; // ← htmx way

// Method 4: RSC - "magic"
// ← React's attempt, but with security issues

All of these are workarounds.

Frameworks Going Their Own Ways

Currently, each framework solves the same problem differently:

  • React: RSC (Flight Protocol)
  • Phoenix: LiveView (WebSocket + diffing)
  • Laravel: Livewire (Ajax + Morphdom)
  • Hotwire: Turbo Streams
  • htmx: HTML over the wire
  • Qwik: Resumability

Each invents its own serialization format, there's no interoperability, and security models vary.

The React incident demonstrates that the absence of standards encourages frameworks to create dangerous custom protocols.


We Need Web Standards for Partial Rendering

I believe we need a web standard for partial rendering.

What the Standard Should Address

1. Component Addressing
   - URLs alone are insufficient
   - Methods to identify specific components within a page
   
2. Incremental Transfer
   - Transfer only changed portions, not the whole
   - Streaming support
   
3. State Synchronization
   - Synchronize state between server and client
   - Handle optimistic updates
   
4. Security Model
   - Which components execute with server privileges?
   - How to validate client input?

Possible Approaches

HTTP Extension:

PATCH /page HTTP/2
Content-Type: application/component-update+json
X-Component-Path: /comments/new

{
  "component": "Comment",
  "props": { "text": "Hello" },
  "action": "append"
}

Or HTML Extension:

<!-- Server response -->
<template component="Comment" action="append" target="#comments">
  <div class="comment">New comment</div>
</template>
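A client could apply such a response with a few lines of generic DOM code. This is a hypothetical sketch of how a browser or a tiny runtime might interpret the attributes invented above (/comments/new, target, action); it is not an existing standard or API proposal.

// Submit the form, fetch the fragment, and apply it where the server says it belongs
const form = document.querySelector('#comment-form');
const res = await fetch('/comments/new', { method: 'POST', body: new FormData(form) });
const doc = new DOMParser().parseFromString(await res.text(), 'text/html');

for (const tpl of doc.querySelectorAll('template[target]')) {
  const target = document.querySelector(tpl.getAttribute('target'));
  if (!target) continue;
  if (tpl.getAttribute('action') === 'append') {
    target.append(tpl.content.cloneNode(true));           // add the new fragment
  } else {
    target.replaceChildren(tpl.content.cloneNode(true));  // default: replace contents
  }
}

Because the payload is plain HTML plus declarative attributes, intermediaries such as CDNs and WAFs could inspect it with tools they already have, which is part of the appeal of standardizing at this layer.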

The Standardization Dilemma

But creating standards isn't easy:

  1. Who will define it? The W3C and WHATWG move slowly, and framework communities are fragmented.
  2. The abstraction level problem: Too low-level is hard to use; too high-level lacks flexibility.
  3. Conflicts with existing infrastructure: Will CDN, proxy, cache, WAF, and IDS understand new protocols?

Conclusion

The React2Shell incident isn't just an implementation bug. It resulted from:

  1. HTTP standards failing to keep pace with modern web app requirements
  2. Each framework creating its own solution
  3. Ignoring security principles

React's mistake wasn't the absence of standards—it was ignoring principles (explicit boundaries, input validation) that should be followed regardless of standards. However, if standards had existed, React likely wouldn't have created a dangerous custom protocol.

We need web standards for partial rendering.

Standardization takes time and won't be perfect, but it's better than each framework inventing dangerous protocols. Just as TypeScript became a de facto standard despite not being official, major frameworks could collaborate to find common ground and propose it to browser vendors.

Until then, it's crucial for developers to prioritize "clear boundaries" and "input validation" over the convenience of "magic."


This article was written based on conversations with Claude.
