How do React Server Components (RSC) work internally in React?

1. Recalling my rough guess at how React Server Components work internally

For what React Server Components are, please refer to my previous post. I also made a rough guess at how React Server Components work internally, with a demo. Here are the key points of my thoughts:

  1. When building the components,
    • a component map is generated during the build phase, so that React can tell whether a component is a Client Component or a Server Component
    • each Server Component is replaced during bundling with a thin wrapper on ClientBase, which is responsible for communicating with the server
  2. When rendering the React tree on the server,
    • recursively render all elements
      • replace each Server Component with a Placeholder that forces suspending, and attach a then() callback to continue rendering the subtree
      • replace each Client Component with a LazyContainer
    • send down the result in chunks through streaming; each Server Component leads to a new chunk, identified by a unique id
    • built-in components and symbols are replaced with strings and revived on the client.
  3. On the client,
    • the ClientBase wrapper of each Server Component requests the server and receives the chunks.
    • upon each chunk, it tries to progressively merge the chunks into one.
    • LazyContainer lazily renders Client Components, and Placeholder forces suspense.

Obviously this is NOT the actual approach; I’m a noob software engineer. So let’s dig into the actual implementation of React Server Components today and see if I was, at least, headed in the right direction.

2. A look at official demo code

From the official RSC demo, below is how the React tree is rendered and streamed.

js
async function renderReactTree(res, props) {
  await waitForWebpack();
  const manifest = readFileSync(
    path.resolve(__dirname, "../build/react-client-manifest.json"),
    "utf8"
  );
  const moduleMap = JSON.parse(manifest);
  const { pipe } = renderToPipeableStream(
    React.createElement(ReactApp, props),
    moduleMap
  );
  pipe(res);
}

The syntax looks similar to what we did.

And on the client there is some code as below.

jsx
function Router() {
  const [cache, setCache] = useState(initialCache);
  const [location, setLocation] = useState({
    selectedId: null,
    isEditing: false,
    searchText: "",
  });
  const locationKey = JSON.stringify(location);
  let content = cache.get(locationKey);
  if (!content) {
    content = createFromFetch(
      fetch("/react?location=" + encodeURIComponent(locationKey))
    );
    cache.set(locationKey, content);
  }
  function navigate(nextLocation) {
    startTransition(() => {
      setLocation((loc) => ({
        ...loc,
        ...nextLocation
      }));
    });
  }
  return (
    <RouterContext.Provider>
      {use(content)}
    </RouterContext.Provider>
  );
}

What the code does is:

  1. Based on the location, fetch the React tree rendered on the server, and cache it. Navigation is then simply changing the location in state.
  2. The response is not ready until it is fetched; use() helps render the Promise.
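
The way use() reads a promise can be sketched roughly as follows. This is a minimal sketch of the idea, not React's real implementation; sketchUse is my own name.

```javascript
// Sketch: read a thenable synchronously if possible, otherwise throw it so
// that a Suspense boundary can catch it and retry after it settles.
function sketchUse(thenable) {
  if (thenable.status === "fulfilled") return thenable.value;
  if (thenable.status === "rejected") throw thenable.reason;
  if (typeof thenable.status !== "string") {
    // First time we see this thenable: instrument it so future reads
    // can check its status synchronously.
    thenable.status = "pending";
    thenable.then(
      (value) => { thenable.status = "fulfilled"; thenable.value = value; },
      (reason) => { thenable.status = "rejected"; thenable.reason = reason; }
    );
  }
  throw thenable; // suspend: the component re-renders once the promise settles
}
```

So reading a fulfilled thenable returns its value immediately, while a pending one is thrown for Suspense to catch.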

We’ll start by digging into these 3 functions as entrypoints.

  1. server: renderToPipeableStream().
  2. client: createFromFetch().
  3. client: use().

3. renderToPipeableStream() renders the React tree on the server

This is the core of React Server Components: it renders the React tree and serializes the result. code

function renderToPipeableStream(
  model: ReactClientValue,
  webpackMap: ClientManifest,
  options?: Options
): PipeableStream {
  const request = createRequest(

This is the object that holds all the information, e.g. model is the React tree. It is just like the "Request" in Express.js: middlewares get info from the Request and modify it, so the next middlewares can consume it.

    model,
    webpackMap,
    options ? options.onError : undefined,
    options ? options.context : undefined,
    options ? options.identifierPrefix : undefined
  );
  let hasStartedFlowing = false;
  startWork(request);

This seems to kick off the whole process.

  return {
    pipe<T: Writable>(destination: T): T {

This connects the internal data to the output stream.

      if (hasStartedFlowing) {
        throw new Error(
          "React currently only supports piping to one writable stream."
        );
      }
      hasStartedFlowing = true;
      startFlowing(request, destination);
      destination.on("drain", createDrainHandler(destination, request));
      return destination;
    },
    abort(reason: mixed) {
      abort(request, reason);
    },
  };
}
function startFlowing(request: Request, destination: Destination): void {
  if (request.status === CLOSING) {
    request.status = CLOSED;
    closeWithError(destination, request.fatalError);
    return;
  }
  if (request.status === CLOSED) {
    return;
  }
  if (request.destination !== null) {
    // We're already flowing.
    return;
  }
  request.destination = destination;
  try {
    flushCompletedChunks(request, destination);
  } catch (error) {
    logRecoverableError(request, error);
    fatalError(request, error);
  }
}

It is way more complex than renderServerComponent() in our attempted demo.

3.1 Request is the object that holds all the necessary info about the whole process

It is a bit like the context in our demo; it has a lot of fields.

const request: Request = {
  status: OPEN,
  fatalError: null,
  destination: null,
  bundlerConfig,
  cache: new Map(),
  nextChunkId: 0,

The incrementing unique id used to label the chunks, like context.id in our demo.

  pendingChunks: 0,

A counter, like context.tasks in our demo.

  abortableTasks: abortSet,
  pingedTasks: pingedTasks,
  completedImportChunks: ([]: Array<Chunk>),

This holds the rendered chunks. Unlike our demo, which streams the response to the client right away, here the data is only streamed after the destination is set in pipe(), so the temporary data needs to be stored on Request.

  completedJSONChunks: ([]: Array<Chunk>),
  completedErrorChunks: ([]: Array<Chunk>),
  writtenSymbols: new Map(),
  writtenClientReferences: new Map(),
  writtenServerReferences: new Map(),
  writtenProviders: new Map(),
  identifierPrefix: identifierPrefix || "",
  identifierCount: 1,
  onError: onError === undefined ? defaultErrorHandler : onError,
  // $FlowFixMe[missing-this-annot]
  toJSON: function (key: string, value: ReactClientValue): ReactJSONValue {

toJSON() is conceptually a bit like serialize() in our demo, but way more complex.

    return resolveModelToJSON(request, this, key, value);
  },
};

3.2 startWork(request) kicks off the whole rendering process

js
function startWork(request: Request): void {
  if (supportsRequestStorage) {
    scheduleWork(() => requestStorage.run(request.cache, performWork, request));
  } else {
    scheduleWork(() => performWork(request));
  }
}
function scheduleWork(callback: () => void) {
  setImmediate(callback);
}

We’ve mentioned setImmediate() before in How React Scheduler works. Simply put, it is a better version of setTimeout(callback, 0).
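
To see what deferring via setImmediate() means in practice, here is a tiny standalone Node.js example (mine, unrelated to React's code):

```javascript
// setImmediate() defers the callback until the current synchronous work
// (and the current event-loop phase) has finished.
const order = [];
setImmediate(() => order.push("deferred work"));
order.push("sync work");
// at this point only "sync work" has run; the callback fires on a
// later event-loop iteration
```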

Let’s skip the request storage part; performWork() seems to do what we want.

js
function performWork(request: Request): void {
  const prevDispatcher = ReactCurrentDispatcher.current;
  const prevCache = getCurrentCache();
  ReactCurrentDispatcher.current = HooksDispatcher;
  setCurrentCache(request.cache);
  prepareToUseHooksForRequest(request);
  try {
    const pingedTasks = request.pingedTasks;
    request.pingedTasks = [];
    for (let i = 0; i < pingedTasks.length; i++) {
      const task = pingedTasks[i];
      retryTask(request, task);
    }
    if (request.destination !== null) {
      flushCompletedChunks(request, request.destination);
    }
  } catch (error) {
    logRecoverableError(request, error);
    fatalError(request, error);
  } finally {
    ReactCurrentDispatcher.current = prevDispatcher;
    setCurrentCache(prevCache);
    resetHooksForRequest();
  }
}
  1. It tries to run all the pinged tasks. Notice that retryTask() is synchronous here.
  2. Then flushCompletedChunks() is called to send the response down to the client.

Task is explained right after this.

3.3 pipe() connects the internal data to the output stream

Similar to what we did with res.write(), flushCompletedChunks() internally uses the write() method.

function pipe<T: Writable>(destination: T): T {
  if (hasStartedFlowing) {
    throw new Error(
      'React currently only supports piping to one writable stream.',
    );
  }
  hasStartedFlowing = true;
  startFlowing(request, destination);
  destination.on('drain', createDrainHandler(destination, request));
  return destination;
},
export function startFlowing(request: Request, destination: Destination): void {
  if (request.status === CLOSING) {
    request.status = CLOSED;
    closeWithError(destination, request.fatalError);
    return;
  }
  if (request.status === CLOSED) {
    return;
  }
  if (request.destination !== null) {
    // We're already flowing.
    return;
  }
  request.destination = destination;
  try {
    flushCompletedChunks(request, destination);
  } catch (error) {
    logRecoverableError(request, error);
    fatalError(request, error);
  }
}
function writeToDestination(destination: Destination, view: Uint8Array) {
  const currentHasCapacity = destination.write(view);

This writes to the stream.

  destinationHasCapacity = destinationHasCapacity && currentHasCapacity;
}

3.4 One Task means one Chunk

From createTask(), we can see that a Task exists to generate a chunk; the id increments once used.

function createTask(
  request: Request,
  model: ReactClientValue,
  context: ContextSnapshot,
  abortSet: Set<Task>
): Task {
  const id = request.nextChunkId++;

A self-incrementing id to make sure it is unique.

  const task: Task = {
    id,
    status: PENDING,
    model,
    context,
    ping: () => pingTask(request, task),
    thenableState: null,
  };
  abortSet.add(task);
  return task;
}

The initial chunk is already scheduled when the Request is first created.

export function createRequest(
  model: ReactClientValue,
  bundlerConfig: ClientManifest,
  onError: void | ((error: mixed) => ?string),
  context?: Array<[string, ServerContextJSONValue]>,
  identifierPrefix?: string,
): Request {
  if (
    ReactCurrentCache.current !== null &&
    ReactCurrentCache.current !== DefaultCacheDispatcher
  ) {
    throw new Error(
      'Currently React only supports one RSC renderer at a time.',
    );
  }
  ReactCurrentCache.current = DefaultCacheDispatcher;
  const abortSet: Set<Task> = new Set();
  const pingedTasks: Array<Task> = [];
  const request: Request = {...}
  request.pendingChunks++;
  const rootContext = createRootContext(context);
  const rootTask = createTask(request, model, rootContext, abortSet);

rootTask is the root chunk.

  pingedTasks.push(rootTask);
  return request;
}

3.5 retryTask() - tries to get one chunk ready

The code is wrapped in a big try...catch. Let’s first look at the try block.

function retryTask(request: Request, task: Task): void {
  if (task.status !== PENDING) {
    // We completed this by other means before we had a chance to retry it.
    return;
  }
  switchContext(task.context);
  try {
    let value = task.model;
    if (
      typeof value === "object" &&
      value !== null &&
      (value: any).$$typeof === REACT_ELEMENT_TYPE

It only processes React Elements ($$typeof: REACT_ELEMENT_TYPE). Obviously only a React Element needs to be rendered (meaning running the function).

    ) {
      // TODO: Concatenate keys of parents onto children.
      const element: React$Element<any> = (value: any);
      // When retrying a component, reuse the thenableState from the
      // previous attempt.
      const prevThenableState = task.thenableState;
      // Attempt to render the Server Component.
      // Doing this here lets us reuse this same task if the next component
      // also suspends.
      task.model = value;
      value = attemptResolveElement(

attemptResolveElement() is the actual method that renders.

        request,
        element.type,
        element.key,
        element.ref,
        element.props,
        prevThenableState
      );
      // Successfully finished this component. We're going to keep rendering
      // using the same task, but we reset its thenable state before continuing.
      task.thenableState = null;
      // Keep rendering and reuse the same task. This inner loop is separate
      // from the render above because we don't need to reset the thenable state
      // until the next time something suspends and retries.
      while (
        typeof value === "object" &&
        value !== null &&
        (value: any).$$typeof === REACT_ELEMENT_TYPE
      ) {

It continues to render if the element tree is just a linked list (one child at each layer). This looks like a micro-optimization, because most of the time React trees are not this simple.

        // TODO: Concatenate keys of parents onto children.
        const nextElement: React$Element<any> = (value: any);
        task.model = value;
        value = attemptResolveElement(
          request,
          nextElement.type,
          nextElement.key,
          nextElement.ref,
          nextElement.props,
          null
        );
      }
    }
    const processedChunk = processModelChunk(request, task.id, value);
    request.completedJSONChunks.push(processedChunk);

processModelChunk() serializes the result and puts it in completedJSONChunks, and as mentioned before, completedJSONChunks is actually processed in flushCompletedChunks().

    request.abortableTasks.delete(task);
    task.status = COMPLETED;
  } catch (thrownValue) {
    ...
  }
}

Two things need to be pointed out.

  1. We don’t see recursion here; attemptResolveElement() is called for only one layer (except for the special case above). The trick actually happens inside serialization; we’ll come back to it soon.
  2. flushCompletedChunks() is not here; rather, it is outside of retryTask() in order to batch-stream multiple chunks. The rescheduling of the task happens in the catch block, explained below.
} catch (thrownValue) {
  const x =
    thrownValue === SuspenseException
      ? // This is a special type of exception used for Suspense. For historical
        // reasons, the rest of the Suspense implementation expects the thrown
        // value to be a thenable, because before `use` existed that was the
        // (unstable) API for suspending. This implementation detail can change
        // later, once we deprecate the old API in favor of `use`.
        getSuspendedThenable()
      : thrownValue;
  // $FlowFixMe[method-unbinding]
  if (typeof x === 'object' && x !== null && typeof x.then === 'function') {

This checks for a thenable.

    // Something suspended again, let's pick it back up later.
    const ping = task.ping;
    x.then(ping, ping);
    task.thenableState = getThenableStateAfterSuspending();
    return;
  } else {
    request.abortableTasks.delete(task);
    task.status = ERRORED;
    const digest = logRecoverableError(request, x);
    if (__DEV__) {
      const {message, stack} = getErrorMessageAndStackDev(x);
      emitErrorChunkDev(request, task.id, digest, message, stack);
    } else {
      emitErrorChunkProd(request, task.id, digest);
    }
  }
}

We can see the check for a thenable: it sets up a then() callback to re-run the task. ping is set during createTask(); it simply schedules the task to run again.

function pingTask(request: Request, task: Task): void {
  const pingedTasks = request.pingedTasks;
  pingedTasks.push(task);
  if (pingedTasks.length === 1) {
    scheduleWork(() => performWork(request));

Notice the task is not run directly, but from the root call of performWork(), to make sure flushCompletedChunks() is run as well.

  }
}

So the picture becomes clear now.

  1. Request holds the tasks, i.e. the jobs that generate Chunks.
  2. performWork() tries to run the tasks and flush the completed chunks.
  3. In each task, if a promise is thrown, it schedules performWork() to repeat the process.
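
This cycle can be modeled with a toy version (all names and details here are my own simplification, not React's code):

```javascript
// Toy model of the task/ping/flush cycle: a task renders a model; if the
// render "suspends" by throwing a thenable, the task pings itself back onto
// the queue when the thenable settles, and the work loop runs again.
function createToyRequest() {
  return { pingedTasks: [], completedChunks: [], nextChunkId: 0 };
}

function createToyTask(request, render) {
  const task = { id: request.nextChunkId++, render };
  request.pingedTasks.push(task);
  return task;
}

function performToyWork(request) {
  const pinged = request.pingedTasks;
  request.pingedTasks = [];
  for (const task of pinged) {
    try {
      const value = task.render();
      request.completedChunks.push({ id: task.id, value }); // "flush"
    } catch (thrown) {
      if (thrown && typeof thrown.then === "function") {
        // Suspended: re-ping the task once the thenable settles.
        thrown.then(() => {
          request.pingedTasks.push(task);
          performToyWork(request);
        });
      } else {
        throw thrown;
      }
    }
  }
  return request.completedChunks;
}
```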

You might wonder: where do new tasks get scheduled for Server Components? We only see tasks getting re-run. Hang on, we’ll soon get the answer.

3.6 attemptResolveElement() processes just one element, rendering if necessary

Since it only processes one element, it is perfect for recursion. It is called “resolve” since we only need to render (run the function of) function components.

The function is quite big, let’s break it down.

3.6.1 Server Component - wrapped in REACT_LAZY_TYPE

function attemptResolveElement(
  request: Request,
  type: any,
  key: null | React$Key,
  ref: mixed,
  props: any,
  prevThenableState: ThenableState | null
): ReactClientValue {
  if (typeof type === "function") {
    if (isClientReference(type)) {
      // This is a reference to a Client Component.
      return [REACT_ELEMENT_TYPE, type, key, props];

The array is a simplified form of the React element object notation, used to generate a more compact serialization result, without the repeated keys "$$typeof", "type", etc.

    }
    // This is a server-side component.
    prepareToUseHooksForComponent(prevThenableState);
    const result = type(props);
    if (
      typeof result === "object" &&
      result !== null &&
      typeof result.then === "function"
    ) {
      // When the return value is in children position we can resolve it immediately,
      // to its value without a wrapper if it's synchronously available.
      const thenable: Thenable<any> = result;
      if (thenable.status === "fulfilled") {
        return thenable.value;
      }
      // TODO: Once we accept Promises as children on the client, we can just return
      // the thenable here.
      return createLazyWrapperAroundWakeable(result);
    }
    return result;
  } else ...

This is actually the branch handling function components.

  1. It checks whether the component is a Client Component with isClientReference(). If it is, it returns [REACT_ELEMENT_TYPE, type, key, props].
  2. If it is a Server Component, it just runs it and returns the result.
    • If the result is a promise that is not yet fulfilled, it wraps it with createLazyWrapperAroundWakeable().
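
To see why the array form is more compact, compare the serialized sizes. This is a toy comparison of my own; the exact tag values in the real wire format are simplified here.

```javascript
// The full element object repeats key names in the JSON output...
const objectForm = JSON.stringify({
  $$typeof: "react.element",
  type: "div",
  key: null,
  props: { children: "hi" },
});

// ...while a positional tuple encodes the same information without them.
const tupleForm = JSON.stringify(["react.element", "div", null, { children: "hi" }]);
```

The saving is small per element, but it applies to every element in the tree.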

Let’s see what the wrapper is.

function createLazyWrapperAroundWakeable(wakeable: Wakeable) {
  // This is a temporary fork of the `use` implementation until we accept
  // promises everywhere.
  const thenable: Thenable<mixed> = (wakeable: any);
  switch (thenable.status) {
    case "fulfilled":
    case "rejected":
      break;
    default: {
      if (typeof thenable.status === "string") {
        // Only instrument the thenable if the status is not defined. If
        // it's defined, but an unknown value, assume it's been instrumented by
        // some custom userspace implementation. We treat it as "pending".
        break;
      }
      const pendingThenable: PendingThenable<mixed> = (thenable: any);
      pendingThenable.status = "pending";
      pendingThenable.then(
        (fulfilledValue) => {
          if (thenable.status === "pending") {
            const fulfilledThenable: FulfilledThenable<mixed> = (thenable: any);
            fulfilledThenable.status = "fulfilled";
            fulfilledThenable.value = fulfilledValue;
          }
        },
        (error: mixed) => {
          if (thenable.status === "pending") {
            const rejectedThenable: RejectedThenable<mixed> = (thenable: any);
            rejectedThenable.status = "rejected";
            rejectedThenable.reason = error;
          }
        }
      );
      break;
    }
  }
  const lazyType: LazyComponent<any, Thenable<any>> = {
    $$typeof: REACT_LAZY_TYPE,
    _payload: thenable,
    _init: readThenable,
  };
  return lazyType;
}
function readThenable<T>(thenable: Thenable<T>): T {
  if (thenable.status === "fulfilled") {
    return thenable.value;
  } else if (thenable.status === "rejected") {
    throw thenable.reason;
  }
  throw thenable;

It throws if the thenable is not ready.

}

OK, so the wrapper tries to set status and value directly on the promise, and it returns a REACT_LAZY_TYPE (as in React.lazy()). Notice that _init is set to readThenable, which throws if the thenable is not fulfilled. To make this easier to understand, let’s fast-forward and see how REACT_LAZY_TYPE is handled first.

Refer to section 3.9 to see how to tell whether a component is client or server.

3.6.2 REACT_LAZY_TYPE - throws Promise if not ready

js
else if (type != null && typeof type === 'object') {
  switch (type.$$typeof) {
    case REACT_LAZY_TYPE: {
      const payload = type._payload;
      const init = type._init;
      const wrappedType = init(payload);
      return attemptResolveElement(
        request,
        wrappedType,
        key,
        ref,
        props,
        prevThenableState,
      );
    }
  }
}

We can see that when REACT_LAZY_TYPE is serialized, init(payload) is run and the promise is thrown. This means that async functions (those that do not fulfill right away) throw automatically during serialization. Nice trick!

What is even cooler: remember that a replacer’s return value continues to be stringified, so the resolved result is serialized again, leading to this piece of code. When a new task is scheduled because of the thrown value, it tries to resolve the REACT_LAZY_TYPE, no longer the original Server Component. This REACT_LAZY_TYPE now holds the promise, so the Server Component is not actually re-run. This kind of makes the work idempotent.
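The pattern above can be sketched in a few lines. This is a minimal toy, not React's actual code; createLazyFromPromise is a made-up name:

```javascript
// A lazy wrapper whose _init returns the value once settled, and otherwise
// throws the still-pending thenable so the caller can catch it and wait.
function createLazyFromPromise(promise) {
  const thenable = promise;
  thenable.status = "pending";
  thenable.then(
    (value) => {
      thenable.status = "fulfilled";
      thenable.value = value;
    },
    (reason) => {
      thenable.status = "rejected";
      thenable.reason = reason;
    }
  );
  return {
    $$typeof: Symbol.for("react.lazy"),
    _payload: thenable,
    _init(payload) {
      if (payload.status === "fulfilled") return payload.value;
      if (payload.status === "rejected") throw payload.reason;
      throw payload; // still pending: throw the thenable itself
    },
  };
}

const lazy = createLazyFromPromise(Promise.resolve("rendered!"));
let thrown;
try {
  lazy._init(lazy._payload); // first attempt: the promise has not settled yet
} catch (x) {
  thrown = x; // the pending thenable itself was thrown
}
```

Once the promise settles, calling _init again returns the value directly instead of throwing.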

3.6.3 Client Component - no rendering, prepare to serialize

js
else if (type != null && typeof type === 'object') {
if (isClientReference(type)) {
// This is a reference to a Client Component.
return [REACT_ELEMENT_TYPE, type, key, props];
}
}

We’ve seen how the client module is tagged automatically. Here it just returns the type for serialization, and it is then serialized as a lazy chunk.

Wait, in section 3.6.1 we saw isClientReference() called on a function; here it is checked on an object again?

Yes, in most cases it is an object. Support for functions was added later for tooling support (original PR).

3.6.4 Intrinsic HTML tag - serializable as it is

js
else if (typeof type === "string") {
// This is a host element. E.g. HTML.
return [REACT_ELEMENT_TYPE, type, key, props];
}

This is for intrinsic HTML tags. Simple, because the string type is already serializable.

3.6.5 Built-in components with Symbols - no rendering, prepare to serialize

js
else if (typeof type === "symbol") {
if (type === REACT_FRAGMENT_TYPE) {
// For key-less fragments, we add a small optimization to avoid serializing
// it as a wrapper.
// TODO: If a key is specified, we should propagate its key to any children.
// Same as if a Server Component has a key.
return props.children;
}
// This might be a built-in React component. We'll let the client decide.
// Any built-in works as long as its props are serializable.
return [REACT_ELEMENT_TYPE, type, key, props];
}

Built-in components are serializable as long as the Symbol can somehow be replaced and revived; we’ll see how soon.

There are other cases handled here, like Context etc.; we’ll skip them for now to narrow down the scope.

3.7 processModelChunk() continues rendering deeper in the tree and serializes at the same time

retryTask() only renders the React tree one layer deep, since rendering a React tree is lazy; once a layer is done, the completed chunk can be sent down right away. processModelChunk() does exactly that, but as we can see, inside of it rendering continues deeper into the tree.

export function processModelChunk(
request: Request,
id: number,
model: ReactClientValue,
): Chunk {
// $FlowFixMe[incompatible-type] stringify can return null
const json: string = stringify(model, request.toJSON); // toJSON() works as a replacer for JSON.stringify()
const row = id.toString(16) + ':' + json + '\n';
return stringToChunk(row);
}
const request: Request = {
...
toJSON: function (key: string, value: ReactClientValue): ReactJSONValue {
return resolveModelToJSON(request, this, key, value);
}
}

3.7.1. Let’s recall how the JSON.stringify() replacer works

The second argument of JSON.stringify() can be a replacer function that allows us to customize the serialized value based on the key.

One interesting part of the replacer function is that we can return an object, JSON.stringify() will recursively work on the object as well, here is an example.

JSON.stringify({ a: 3 }, (k, v) => {
if (k === "a") {
return { b: 4 };
}
if (k === "b") {
return { c: 5 };
}
return v;
});
// '{"a":{"b":{"c":5}}}'

See how we use the replacer to transform the original object while stringifying it.

Look how well this suits React tree rendering: because rendering is lazy, we can use the replacer to run a function component whenever we find it appropriate. Clever, right?
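Here is a toy sketch of that idea (all names are made up; plain objects stand in for React elements): the replacer calls function components as JSON.stringify() walks the tree, so rendering happens lazily during serialization.

```javascript
// h() builds a plain-object stand-in for a React element.
function h(type, props) {
  return { $$typeof: "element", type, props };
}
const Greeting = ({ name }) => h("span", { children: "Hi " + name });
const App = () => h("div", { children: h(Greeting, { name: "RSC" }) });

function render(key, value) {
  // Whenever the replacer meets an element whose type is a function,
  // call it; the returned element is then stringified recursively in turn.
  while (value && value.$$typeof === "element" && typeof value.type === "function") {
    value = value.type(value.props);
  }
  return value;
}

// Components are executed one by one as JSON.stringify() descends the tree.
const json = JSON.stringify(h(App, {}), render);
```

No component runs until JSON.stringify() actually reaches it, which is exactly the laziness the real renderer exploits.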

Keep this in mind and let’s continue.

3.7.2 resolveModelToJSON() does the actual serialization, as a replacer of JSON.stringify()

As its name suggests, resolveModelToJSON() stringifies the rendered React tree. It is quite long, so let’s break it down.

3.7.2.1 REACT_ELEMENT_TYPE → ’$’
export function resolveModelToJSON(
request: Request,
parent:
| {+[key: string | number]: ReactClientValue}
| $ReadOnlyArray<ReactClientValue>,
key: string,
value: ReactClientValue,
): ReactJSONValue {
// Special Symbols
switch (value) {
case REACT_ELEMENT_TYPE:
return '$';
}

So Symbol.for("react.element") is replaced with $, much shorter than our choice. The $ itself doesn’t mean anything; presumably the shortest string was chosen because there are usually a lot of Symbol.for("react.element") values to serialize.
3.7.2.2 Server Component - serialized as lazy chunk, meanwhile a new Task is scheduled
// Resolve Server Components.
while (
typeof value === "object" &&
value !== null &&
((value: any).$$typeof === REACT_ELEMENT_TYPE ||
(value: any).$$typeof === REACT_LAZY_TYPE)
) {
try {
switch ((value: any).$$typeof) {
case REACT_ELEMENT_TYPE: {
// TODO: Concatenate keys of parents onto children.
const element: React$Element<any> = (value: any);
// Attempt to render the Server Component.
value = attemptResolveElement(
request,
element.type,
element.key,
element.ref,
element.props,
null
);
break;
}
case REACT_LAZY_TYPE: {
const payload = (value: any)._payload;
const init = (value: any)._init;
value = init(payload);
// Note how init() is called here for REACT_LAZY_TYPE. Remember that when
// a Server Component is first met, it is replaced with REACT_LAZY_TYPE,
// and it is then processed here.
break;
}
}
} catch (thrownValue) {
// As we mentioned, init() throws.
const x =
thrownValue === SuspenseException
? // This is a special type of exception used for Suspense. For historical
// reasons, the rest of the Suspense implementation expects the thrown
// value to be a thenable, because before `use` existed that was the
// (unstable) API for suspending. This implementation detail can change
// later, once we deprecate the old API in favor of `use`.
getSuspendedThenable()
: thrownValue;
// $FlowFixMe[method-unbinding]
if (typeof x === "object" && x !== null && typeof x.then === "function") {
// Something suspended, we'll need to create a new task and resolve it later.
request.pendingChunks++;
const newTask = createTask(
request,
value,
getActiveContext(),
request.abortableTasks
);
const ping = newTask.ping;
x.then(ping, ping);
newTask.thenableState = getThenableStateAfterSuspending();
return serializeLazyID(newTask.id);
} else {
// Something errored. We'll still send everything we have up until this point.
// We'll replace this element with a lazy reference that throws on the client
// once it gets rendered.
request.pendingChunks++;
const errorId = request.nextChunkId++;
const digest = logRecoverableError(request, x);
emitErrorChunkProd(request, errorId, digest);
return serializeLazyID(errorId);
}
}
}

Here is how a React Element is processed: same as in retryTask(), attemptResolveElement() is called. So when a Server Component is met, a thenable is thrown and caught here. The handling of the thrown promise differs from retryTask() though, as below.

request.pendingChunks++;
const newTask = createTask(
request,
value,
// This value is the current node on the tree, so the task scheduled here
// will try to render and serialize this sub-tree later.
getActiveContext(),
request.abortableTasks
);
const ping = newTask.ping;
x.then(ping, ping);
newTask.thenableState = getThenableStateAfterSuspending();
return serializeLazyID(newTask.id);
function serializeLazyID(id: number): string {
return "$L" + id.toString(16); // I guess L means Lazy?
}

We see a new task is created, meaning there is going to be a new chunk. A chunk is identified by its id, so this just returns $L{id}, similar to what we did with <Placeholder id="xx"/>.

We’ll soon see how $L{id} is parsed and replaced with a Lazy component on client.
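Putting the pieces together, here is a rough sketch of the wire format, assuming the row format we saw in processModelChunk() above (hex id, ":", JSON, newline); the suspended sub-tree is referenced as "$L1" inside chunk 0, and chunk 1 carries its content once the promise resolves.

```javascript
// Mirror of the two helpers seen in the source above.
function serializeLazyID(id) {
  return "$L" + id.toString(16);
}
function processModelChunk(id, model) {
  return id.toString(16) + ":" + JSON.stringify(model) + "\n";
}

// Chunk 0: the synchronously rendered shell, pointing at lazy chunk 1.
const chunk0 = processModelChunk(0, ["$", "div", null, { children: serializeLazyID(1) }]);
// Chunk 1: streamed later, once the suspended Server Component finishes.
const chunk1 = processModelChunk(1, ["$", "p", null, { children: "loaded!" }]);
```

The client can render chunk 0 immediately and fill in "$L1" when chunk 1 arrives.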

3.7.2.3 null → null
js
if (value === null) {
return null;
}

null is just serialized as null.

3.7.2.4 Client Component - serialized as lazy chunk as well
if (typeof value === "object") {
if (isClientReference(value)) {
return serializeClientReference(request, parent, key, (value: any));
}
function serializeClientReference(
request: Request,
parent:
| { +[key: string | number]: ReactClientValue }
| $ReadOnlyArray<ReactClientValue>,
key: string,
clientReference: ClientReference<any>
): string {
const clientReferenceKey: ClientReferenceKey =
getClientReferenceKey(clientReference);
const writtenClientReferences = request.writtenClientReferences;
const existingId = writtenClientReferences.get(clientReferenceKey);
if (existingId !== undefined) {
// This checks if the client reference has already been streamed.
// We could use it multiple times, but we only want to preload it once.
if (parent[0] === REACT_ELEMENT_TYPE && key === "1") {
// If we're encoding the "type" of an element, we can refer
// to that by a lazy reference instead of directly since React
// knows how to deal with lazy values. This lets us suspend
// on this component rather than its parent until the code has
// loaded.
return serializeLazyID(existingId);
}
return serializeByValueID(existingId);
}
try {
const clientReferenceMetadata: ClientReferenceMetadata =
resolveClientReferenceMetadata(request.bundlerConfig, clientReference);
request.pendingChunks++;
const importId = request.nextChunkId++;
emitImportChunk(request, importId, clientReferenceMetadata);
writtenClientReferences.set(clientReferenceKey, importId);
if (parent[0] === REACT_ELEMENT_TYPE && key === "1") {
// This checks whether this component reference is rendered as the element
// type, rather than being passed as props.
// If we're encoding the "type" of an element, we can refer
// to that by a lazy reference instead of directly since React
// knows how to deal with lazy values. This lets us suspend
// on this component rather than its parent until the code has
// loaded.
return serializeLazyID(importId);
}
return serializeByValueID(importId);
} catch (x) {
request.pendingChunks++;
const errorId = request.nextChunkId++;
const digest = logRecoverableError(request, x);
emitErrorChunkProd(request, errorId, digest);
return serializeByValueID(errorId);
}
}
function getClientReferenceKey(
reference: ClientReference<any>
): ClientReferenceKey {
return reference.$$async ? reference.$$id + "#async" : reference.$$id;
}

Above is how client component is serialized. So basically,

  1. it is identified by a unique id
  2. emitImportChunk() adds the chunk into completedImportChunks, which will be flushed at the right timing.
  3. it is then serialized with $L{id} or just ${id}. If it is passed merely as a prop, not as the element type, there is no need to lazily suspend on it.
  4. if it has already been emitted, the serialized id is reused without sending down the chunk again.

Despite a lot of differing details, I think our approach went in a similar direction. We replaced it with <LazyContainer componentName="xxx">.

In Section 3.9, we see how the reference is created automatically. Here, resolveClientReferenceMetadata() resolves the final module info together with bundlerConfig, which is passed into renderToPipeableStream(). This is bundler specific, so we’ll skip it for now; just know that a module path is resolved to the final built module id etc., which is consumed on the client.

js
function resolveClientReferenceMetadata<T>(
config: ClientManifest,
clientReference: ClientReference<T>,
): ClientReferenceMetadata {
const modulePath = clientReference.$$id;
let name = '';
let resolvedModuleData = config[modulePath];
if (resolvedModuleData) {
// The potentially aliased name.
name = resolvedModuleData.name;
} else {
// We didn't find this specific export name but we might have the * export
// which contains this name as well.
// TODO: It's unfortunate that we now have to parse this string. We should
// probably go back to encoding path and name separately on the client reference.
const idx = modulePath.lastIndexOf('#');
if (idx !== -1) {
name = modulePath.substr(idx + 1);
resolvedModuleData = config[modulePath.substr(0, idx)];
}
if (!resolvedModuleData) {
throw new Error(
'Could not find the module "' +
modulePath +
'" in the React Client Manifest. ' +
'This is probably a bug in the React Server Components bundler.',
);
}
}
return {
id: resolvedModuleData.id,
chunks: resolvedModuleData.chunks,
name: name,
async: !!clientReference.$$async,
};
}
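As a usage sketch of the lookup above (the manifest content and the resolveMetadata name here are made up for illustration), the resolution boils down to: try the full path first, then fall back to splitting off the export name after "#".

```javascript
// A mock client manifest: module path -> built chunk info.
const clientManifest = {
  "file:///src/Counter.js": { id: "./src/Counter.js", chunks: ["client0"], name: "*" },
};

function resolveMetadata(config, reference) {
  const modulePath = reference.$$id;
  let name = "";
  let data = config[modulePath];
  if (data) {
    name = data.name; // the potentially aliased export name
  } else {
    // Fall back to the "*" export: path and export name are joined by "#".
    const idx = modulePath.lastIndexOf("#");
    if (idx !== -1) {
      name = modulePath.slice(idx + 1);
      data = config[modulePath.slice(0, idx)];
    }
    if (!data) throw new Error("Not in the React Client Manifest: " + modulePath);
  }
  return { id: data.id, chunks: data.chunks, name, async: !!reference.$$async };
}

const meta = resolveMetadata(clientManifest, {
  $$id: "file:///src/Counter.js#default",
  $$async: false,
});
```

The returned metadata is what gets serialized into the import chunk for the client to load.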
3.7.2.5 Thenable (Promise) - serialized similarly to a Lazy Chunk
js
else if (typeof value.then === "function") {
// We assume that any object with a .then property is a "Thenable" type,
// or a Promise type. Either of which can be represented by a Promise.
const promiseId = serializeThenable(request, (value: any));
return serializePromiseID(promiseId);
}

Interesting that there is a separate serializeThenable(), given we’ve already seen that a Server Component triggers a Promise to be thrown.

The promise here is the general case: for example, a promise can be passed as a component prop, so it is not an async component but an async prop. This is cool and we didn’t cover it in our demo.

Let’s see what it is in serializeThenable().

js
function serializeThenable(request: Request, thenable: Thenable<any>): number {
request.pendingChunks++;
const newTask = createTask(
request,
null,
getActiveContext(),
request.abortableTasks
);
switch (thenable.status) {
case "fulfilled": {
// We have the resolved value, we can go ahead and schedule it for serialization.
newTask.model = thenable.value;
pingTask(request, newTask);
return newTask.id;
}
case "rejected": {
const x = thenable.reason;
const digest = logRecoverableError(request, x);
emitErrorChunkProd(request, newTask.id, digest);
return newTask.id;
}
default: {
if (typeof thenable.status === "string") {
// Only instrument the thenable if the status if not defined. If
// it's defined, but an unknown value, assume it's been instrumented by
// some custom userspace implementation. We treat it as "pending".
break;
}
const pendingThenable: PendingThenable<mixed> = (thenable: any);
pendingThenable.status = "pending";
pendingThenable.then(
(fulfilledValue) => {
if (thenable.status === "pending") {
const fulfilledThenable: FulfilledThenable<mixed> = (thenable: any);
fulfilledThenable.status = "fulfilled";
fulfilledThenable.value = fulfilledValue;
}
},
(error: mixed) => {
if (thenable.status === "pending") {
const rejectedThenable: RejectedThenable<mixed> = (thenable: any);
rejectedThenable.status = "rejected";
rejectedThenable.reason = error;
}
}
);
break;
}
}
thenable.then(
(value) => {
newTask.model = value;
pingTask(request, newTask);
},
(reason) => {
newTask.status = ERRORED;
// TODO: We should ideally do this inside performWork so it's scheduled
const digest = logRecoverableError(request, reason);
emitErrorChunkProd(request, newTask.id, digest);
if (request.destination !== null) {
flushCompletedChunks(request, request.destination);
}
}
);
return newTask.id;
}
function serializePromiseID(id: number): string {
return "$@" + id.toString(16);
}

Looking at the code, it is actually similar to the lazy chunk. We’ll see the difference on the client soon.
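A condensed sketch of the idea (simplified bookkeeping, made-up helper shapes): the promise prop is replaced by a "$@{id}" reference, and a separate chunk carries the resolved value later.

```javascript
function serializePromiseID(id) {
  return "$@" + id.toString(16);
}

let nextChunkId = 0;
const flushedChunks = [];

// Register the thenable: its resolved value becomes a later chunk.
function serializeThenable(thenable) {
  const id = nextChunkId++;
  thenable.then((value) => {
    flushedChunks.push(id.toString(16) + ":" + JSON.stringify(value) + "\n");
  });
  return id;
}

// An async prop is serialized as "$@0" now; chunk 0 streams down later.
const ref = serializePromiseID(serializeThenable(Promise.resolve({ note: "async prop" })));
```

The client resolves "$@0" to a promise of chunk 0's content, so an async prop stays a promise on the client too.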

3.7.2.6 Primitives - just as they are
js
if (typeof value === "string") {
return escapeStringValue(value);
}
if (typeof value === "boolean") {
return value;
}
if (typeof value === "number") {
return serializeNumber(value);
}
if (typeof value === "undefined") {
return serializeUndefined();
}
if (typeof value === "bigint") {
return serializeBigInt(value);
}

These are just primitive value serializations. We’ve already seen how client components and server components get serialized, so nothing fancy here.
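One detail worth noting about strings: since "$"-prefixed strings carry special meaning on the wire ("$L1", "$@0", ...), user strings that happen to start with "$" need escaping. A sketch of the idea (React's actual escaping may differ in detail; parseStringValue here is illustrative):

```javascript
// Server side: a leading "$" is doubled so it cannot be mistaken for a reference.
function escapeStringValue(value) {
  return value[0] === "$" ? "$" + value : value;
}

// Client side: "$$..." strips back to a plain "$..." string;
// any other "$"-prefixed string is treated as a special reference.
function parseStringValue(value) {
  if (value[0] === "$") {
    return value[1] === "$" ? value.slice(1) : "<special: " + value + ">";
  }
  return value;
}

const escaped = escapeStringValue("$100");
```

This way a price string like "$100" round-trips safely without colliding with "$L"/"$@" references.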

3.7.2.7 Built-in Components (Symbols) - Import Chunk
js
if (typeof value === "symbol") {
const writtenSymbols = request.writtenSymbols;
const existingId = writtenSymbols.get(value);
if (existingId !== undefined) {
return serializeByValueID(existingId);
}
// $FlowFixMe[incompatible-type] `description` might be undefined
const name: string = value.description;
if (Symbol.for(name) !== value) {
throw new Error(
"Only global symbols received from Symbol.for(...) can be passed to Client Components. " +
`The symbol Symbol.for($\{
// $FlowFixMe[incompatible-type] `description` might be undefined
value.description
}) cannot be found among global symbols.` +
describeObjectForErrorMessage(parent, key)
);
}
request.pendingChunks++;
const symbolId = request.nextChunkId++;
emitSymbolChunk(request, symbolId, name);
writtenSymbols.set(value, symbolId);
return serializeByValueID(symbolId);
}

Similar to the client component, built-in symbols generate an import chunk.
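A tiny illustration of why only global symbols survive the round trip: Symbol.for() uses a global registry, so the client can revive the exact same symbol from its string description, while a plain Symbol() cannot be found.

```javascript
// The server sends only the description string; the client revives it.
const description = Symbol.for("react.suspense").description; // "react.suspense"
const revived = Symbol.for(description);

// A non-registered symbol with the same description is a different value,
// which is why resolveModelToJSON() throws for non-global symbols.
const local = Symbol("react.suspense");
```

This is exactly the check `Symbol.for(name) !== value` in the code above.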

3.8 Why handling thrown Promises in two locations?

We see that attemptResolveElement() is attempted in both retryTask() and resolveModelToJSON(), both of which handle thrown Promises. Also, resolveModelToJSON() is called within retryTask(), meaning there are nested try ... catch blocks. Why don’t we just have one?

In retryTask(), the task is retried when a promise is thrown. In resolveModelToJSON(), a new task is scheduled with an id, and serialization continues with that id.

I guess the difference comes from what happens if the root element itself suspends. Since the root task produces the first chunk, with id 0, handling it in resolveModelToJSON() would schedule another task with id 1, which means the initial chunk 0 would actually do nothing.

So this makes sure the first chunk is meaningful; this is my best guess.

3.9 How to tell if component is Client or Server Component?

We saw that isClientReference() seems to expect an object, so let’s look at the details.

js
const CLIENT_REFERENCE_TAG = Symbol.for("react.client.reference");
const SERVER_REFERENCE_TAG = Symbol.for("react.server.reference");
export function getClientReferenceKey(
reference: ClientReference<any>
): ClientReferenceKey {
return reference.$$async ? reference.$$id + "#async" : reference.$$id;
}
export function isClientReference(reference: Object): boolean {
return reference.$$typeof === CLIENT_REFERENCE_TAG;
}

So React checks whether a component is a client module by looking at $$typeof. The tag CLIENT_REFERENCE_TAG is added by overwriting Module.prototype._compile() in Node.js, since we are talking about rendering components on the server here.

js
const originalCompile = Module.prototype._compile;
// $FlowFixMe[prop-missing] found when upgrading Flow
Module.prototype._compile = function (
this: any,
content: string,
filename: string,
): void {
// Do a quick check for the exact string. If it doesn't exist, don't
// bother parsing.
if (
content.indexOf('use client') === -1 &&
content.indexOf('use server') === -1
) {
return originalCompile.apply(this, arguments);
}
...
if (useClient) {
const moduleId: string = (url.pathToFileURL(filename).href: any);
const clientReference = Object.defineProperties(({}: any), {
$$typeof: {value: CLIENT_REFERENCE},
// Represents the whole Module object instead of a particular import.
$$id: {value: moduleId},
$$async: {value: false},
});
// $FlowFixMe[incompatible-call] found when upgrading Flow
this.exports = new Proxy(clientReference, proxyHandlers);
}

Module.prototype._compile() is how require() works under the hood. In the above code it checks for the existence of "use client" and modifies the exports through this.exports. Here is a simple demo that shows we can alter the exports on the fly. Notice that $$id is also added, which will be serialized and streamed.

This is a much better approach than ours, since there is no need to build and generate separate JS resources for the server.

3.10 flushCompletedChunks() streams down the chunks

function flushCompletedChunks(
request: Request,
destination: Destination
): void {
beginWriting(destination);
try {
// We emit module chunks first in the stream so that
// they can be preloaded as early as possible.
const importsChunks = request.completedImportChunks;

client component references

let i = 0;
for (; i < importsChunks.length; i++) {
request.pendingChunks--;
const chunk = importsChunks[i];
const keepWriting: boolean = writeChunkAndReturn(destination, chunk);
if (!keepWriting) {
request.destination = null;
i++;
break;
}
}
importsChunks.splice(0, i);
// Next comes model data.
const jsonChunks = request.completedJSONChunks;

rendered React tree in JSON

i = 0;
for (; i < jsonChunks.length; i++) {
request.pendingChunks--;
const chunk = jsonChunks[i];
const keepWriting: boolean = writeChunkAndReturn(destination, chunk);
if (!keepWriting) {
request.destination = null;
i++;
break;
}
}
jsonChunks.splice(0, i);
// Finally, errors are sent. The idea is that it's ok to delay
// any error messages and prioritize display of other parts of
// the page.
const errorChunks = request.completedErrorChunks;

errors, will skip in this post

i = 0;
for (; i < errorChunks.length; i++) {
request.pendingChunks--;
const chunk = errorChunks[i];
const keepWriting: boolean = writeChunkAndReturn(destination, chunk);
if (!keepWriting) {
request.destination = null;
i++;
break;
}
}
errorChunks.splice(0, i);
} finally {
completeWriting(destination);
}
flushBuffered(destination);
if (request.pendingChunks === 0) {
// We're done.
close(destination);
}
}

3.11 Some examples of the streamed data

Phew! Let’s look at the streamed data through some real examples.

We can get the logs by modifying the test in ReactFlightDOM-test.js and then running yarn test ReactFlightDOM-test. The log point is in writeChunkAndReturn(), where each chunk is written into the stream.

3.11.1 All intrinsic HTML tags

jsx
function Text({children}) {
return <span>{children}</span>;
}
function HTML() {
return (
<div>
<Text>hello</Text>
<Text>world</Text>
</div>
);
}
function App() {
const model = {
html: <HTML />,
};

Below is the chunk we send.

0:{"html":["$","div",null,{"children":[["$","span",null,{"children":"hello"}],["$","span",null,{"children":"world"}]]}]}

With what we’ve learned we can easily see what they are.

Leading chunk id 0: this is the chunk identifier, meaning this is the first chunk. Since there are no async functions, there is only going to be one chunk.

3.11.2 Single Server Component that suspends

jsx
function Text({children}) {
return children;
}
function makeDelayedText() {
let _resolve, _reject;
let promise = new Promise((resolve, reject) => {
_resolve = () => {
promise = null;
resolve();
};
_reject = e => {
promise = null;
reject(e);
};
});
async function DelayedText({children}) {
await promise;
return <Text>{children}</Text>;
}
return [DelayedText, _resolve, _reject];
}
const [Name, resolveName] = makeDelayedText();
const model = {
rootContent: <Name>JSer</Name>,
};

Here are the chunks we get:

0:{"rootContent":"$L1"}
1:"JSer"
  1. In chunk:0, $L1 means this is a reference to lazy chunk:1. Since DelayedText is an async Server Component, it is sent down in a separate chunk.
  2. chunk:1 is sent when the promise resolves.

Obviously there needs to be some getter method on the client to replace $L1 with the real chunk data.

3.11.3 Nested Server Component that suspends

jsx
const model = {
rootContent: <Name><Name>JSer</Name></Name>
};
0:{"rootContent":"$L1"}
1:"$L2"
2:"JSer"

Similar to the above, chunk:1 has a reference to lazy chunk:2.

3.11.4 Suspense

Similar to the above example, if we have a Suspense:

jsx
const model = {
rootContent: <Suspense fallback="loading..."><Name>JSer</Name></Suspense>,
};

We get the stream below.

1:"$Sreact.suspense"
0:{"rootContent":["$","$1",null,{"fallback":"loading...","children":"$L2"}]}
2:"JSer"
  1. chunk:1 comes before chunk:0; this is because reference chunks are flushed first in flushCompletedChunks().
  2. chunk:1 is a reference chunk for Suspense; $Sreact.suspense comes from serializeSymbolReference().
  3. in chunk:0, $1 refers to chunk:1; among its fallback and children props, children refers to lazy chunk:2.
  4. later, chunk:2 is generated and flushed.

3.11.5 Client Component

jsx
function Input() {
return <input />;
}
const InputClient = clientExports(Input);
const model = {
rootContent: (
<Name>
<InputClient />
<Name>JSer</Name>
</Name>
),
};

We got:

0:{"rootContent":"$L1"}
2:I{"id":"1","chunks":[],"name":"*","async":false}
1:[["$","$L2",null,{}],"$L3"]
3:"JSer"

chunk:2 is for the Client Component. It is sent earlier so that the client can start dynamically importing the JS resources as early as possible. It is encoded with the tag I, which comes from processImportChunk().

js
export function processImportChunk(
request: Request,
id: number,
clientReferenceMetadata: ReactClientValue,
): Chunk {
// $FlowFixMe[incompatible-type] stringify can return null
const json: string = stringify(clientReferenceMetadata);
const row = serializeRowHeader('I', id) + json + '\n';
return stringToChunk(row);
}

So it is different from Suspense: symbols are processed by processReferenceChunk().

js
export function processReferenceChunk(
request: Request,
id: number,
reference: string,
): Chunk {
const json = stringify(reference);
const row = id.toString(16) + ':' + json + '\n';
return stringToChunk(row);
}

We’ll see the difference between these two on the client soon.

Cool, we are roughly done with the streaming part. Now we need to figure out the client code to see how the streamed data is consumed.

4. Client: createFromFetch()

The code below fetches the streamed data and prepares it for rendering.

js
function createFromFetch<T>(
promiseForResponse: Promise<Response>,
options?: Options
): Thenable<T> {
const response: FlightResponse = createResponseFromOptions(options);
promiseForResponse.then(
function (r) {
startReadingFromStream(response, (r.body: any));
},
function (e) {
reportGlobalError(response, e);
}
);
return getRoot(response);
}

The stream body is processed by startReadingFromStream(), which results in a FlightResponse that is then consumed by getRoot(). We’ll figure it out through the following questions.

  1. what is FlightResponse?
  2. how does startReadingFromStream() work?
  3. what does getRoot() do?

4.1 What is FlightResponse?

function createResponseFromOptions(options: void | Options) {
return createResponse(
null,
options && options.callServer ? options.callServer : undefined,
);
}
export function createResponse(
bundlerConfig: SSRManifest,
callServer: void | CallServerCallback
): Response {
// NOTE: CHECK THE COMPILER OUTPUT EACH TIME YOU CHANGE THIS.
// It should be inlined to one object literal but minor changes can break it.
const stringDecoder = supportsBinaryStreams ? createStringDecoder() : null;
const response: any = createResponseBase(bundlerConfig, callServer);
response._partialRow = "";
if (supportsBinaryStreams) {
response._stringDecoder = stringDecoder;
}
// Don't inline this call because it causes closure to outline the call above.
response._fromJSON = createFromJSONCallback(response);
return response;
}
export function createResponseBase(
bundlerConfig: SSRManifest,
callServer: void | CallServerCallback
): ResponseBase {
const chunks: Map<number, SomeChunk<any>> = new Map();
const response = {
_bundlerConfig: bundlerConfig,
_callServer: callServer !== undefined ? callServer : missingCall,
_chunks: chunks,

Where the chunks are stored

};
return response;
}
function createFromJSONCallback(response: Response) {
// $FlowFixMe[missing-this-annot]
return function (key: string, value: JSONValue) {

This function is used to parse the streamed response,

working as a reviver of JSON.parse()

if (typeof value === "string") {
// We can't use .bind here because we need the "this" value.
return parseModelString(response, this, key, value);
}
if (typeof value === "object" && value !== null) {
return parseModelTuple(response, value);
}
return value;
};
}

We can see that FlightResponse is a data structure that holds the chunks in _chunks; it also has a _fromJSON method to parse the streamed response.

4.1.1 createFromJSONCallback() returns a JSON.parse() reviver to parse the streamed JSON

The returned function takes key and value arguments, and parseModelString() is called inside.

export function parseModelString(
response: Response,
parentObject: Object,
key: string,
value: string
): any {
if (value[0] === "$") {
if (value === "$") {
// A very common symbol.
return REACT_ELEMENT_TYPE;
}
switch (value[1]) {
case "$": {
// This was an escaped string value.
return value.substring(1);
}
case "L": {
// Lazy node
const id = parseInt(value.substring(2), 16);
const chunk = getChunk(response, id);
// We create a React.lazy wrapper around any lazy values.
// When passed into React, we'll know how to suspend on this.
return createLazyChunkWrapper(chunk);
}
case "@": {
// Promise
const id = parseInt(value.substring(2), 16);
const chunk = getChunk(response, id);
return chunk;
}
case "S": {
// Symbol
return Symbol.for(value.substring(2));
}
case "P": {
// Server Context Provider
return getOrCreateServerContext(value.substring(2)).Provider;
}
case "F": {
// Server Reference
const id = parseInt(value.substring(2), 16);
const chunk = getChunk(response, id);
switch (chunk.status) {
case RESOLVED_MODEL:
initializeModelChunk(chunk);
break;
}
// The status might have changed after initialization.
switch (chunk.status) {
case INITIALIZED: {
const metadata = chunk.value;
return createServerReferenceProxy(response, metadata);
}
// We always encode it first in the stream so it won't be pending.
default:
throw chunk.reason;
}
}
case "I": {
// $Infinity
return Infinity;
}
case "-": {
// $-0 or $-Infinity
if (value === "$-0") {
return -0;
} else {
return -Infinity;
}
}
case "N": {
// $NaN
return NaN;
}
case "u": {
// matches "$undefined"
// Special encoding for `undefined` which can't be serialized as JSON otherwise.
return undefined;
}
case "n": {
// BigInt
return BigInt(value.substring(2));
}
default: {
// We assume that anything else is a reference ID.
const id = parseInt(value.substring(1), 16);
const chunk = getChunk(response, id);
switch (chunk.status) {
case RESOLVED_MODEL:
initializeModelChunk(chunk);
break;
case RESOLVED_MODULE:
initializeModuleChunk(chunk);
break;
}
// The status might have changed after initialization.
switch (chunk.status) {
case INITIALIZED:
return chunk.value;
case PENDING:
case BLOCKED:
const parentChunk = initializingChunk;
chunk.then(
createModelResolver(parentChunk, parentObject, key),
createModelReject(parentChunk)
);
return null;
default:
throw chunk.reason;
}
}
}
}
return value;
}

parseModelString() works as a JSON.parse() reviver, which is the counterpart of the JSON.stringify() replacer. It is called recursively to revive the serialized data, i.e. the encoded values such as $1, $L1 and $Sreact.suspense that we saw before.

4.1.1.1 ”$” is revived as REACT_ELEMENT_TYPE.

Simple.

js
if (value[0] === "$") {
if (value === "$") {
// A very common symbol.
return REACT_ELEMENT_TYPE;
}
}
4.1.1.2 ”${id}” is replaced with the target chunk first.
default: {
// We assume that anything else is a reference ID.
const id = parseInt(value.substring(1), 16);
const chunk = getChunk(response, id);
switch (chunk.status) {
case RESOLVED_MODEL:
initializeModelChunk(chunk);
break;
case RESOLVED_MODULE:
initializeModuleChunk(chunk);
break;
}
// The status might have changed after initialization.
switch (chunk.status) {
case INITIALIZED:
return chunk.value;
case PENDING:
case BLOCKED:
const parentChunk = initializingChunk;
chunk.then(
createModelResolver(parentChunk, parentObject, key),
createModelReject(parentChunk)
);
return null;
default:
throw chunk.reason;
}
}

The code needs some explanation.

getChunk() tries to get the Chunk by id, but will create a placeholder chunk (pending chunk) for future resolving if it does not exist yet. (more about Chunk).

So for the example 3.11.4 Suspense

1:"$Sreact.suspense"
0:{"rootContent":["$","$1",null,{"fallback":"loading...","children":"$L2"}]}
2:"JSer"

getChunk(1) will return the chunk whose value is the string "$Sreact.suspense", which is processed again into Symbol.for("react.suspense") (see next section).

4.1.1.3 “$S{tag}” is revived as symbols.
js
case "S": {
// Symbol
return Symbol.for(value.substring(2));
}

Simple. So "$Sreact.suspense" is replaced with Symbol.for("react.suspense").

4.1.1.4 “$L{id}” is revived as REACT_LAZY_TYPE.
js
case "L": {
// Lazy node
const id = parseInt(value.substring(2), 16);
const chunk = getChunk(response, id);
// We create a React.lazy wrapper around any lazy values.
// When passed into React, we'll know how to suspend on this.
return createLazyChunkWrapper(chunk);
}

First of all, a lazy chunk always comes later, so getChunk() returns a pending chunk. The chunk is a promise here, not yet fulfilled.

js
function createLazyChunkWrapper<T>(
chunk: SomeChunk<T>,
): LazyComponent<T, SomeChunk<T>> {
const lazyType: LazyComponent<T, SomeChunk<T>> = {
$$typeof: REACT_LAZY_TYPE,
_payload: chunk,
_init: readChunk,
};
return lazyType;
}

So the promise is wrapped in REACT_LAZY_TYPE, and readChunk() throws the promise when it is not yet resolved.

js
function readChunk<T>(chunk: SomeChunk<T>): T {
// If we have resolved content, we try to initialize it first which
// might put us back into one of the other states.
switch (chunk.status) {
case RESOLVED_MODEL:
initializeModelChunk(chunk);
break;
case RESOLVED_MODULE:
initializeModuleChunk(chunk);
break;
}
// The status might have changed after initialization.
switch (chunk.status) {
case INITIALIZED:
return chunk.value;
case PENDING:
case BLOCKED:
// eslint-disable-next-line no-throw-literal
throw ((chunk: any): Thenable<T>);
default:
throw chunk.reason;
}
}

If we look into the rendering of REACT_LAZY_TYPE, we can see the call to init(payload), which means a lazy chunk suspends when rendered and re-renders when the expected chunk comes in. This is much better than our approach of forced suspending, because it maintains the semantics that the re-render should be low priority.

js
case REACT_LAZY_TYPE: {
const payload = newChild._payload;
const init = newChild._init;
return createChild(returnFiber, init(payload), lanes);
}

Another question is how this works for Client Components, since the serialization is the same ($L2). Obviously they need to be handled differently based on the streamed data; this is explained in the section on processBinaryChunk().

4.2 startReadingFromStream() uses a ReadableStream reader to read the streamed data.

js
function startReadingFromStream(
response: FlightResponse,
stream: ReadableStream
): void {
const reader = stream.getReader();
function progress({
done,
value,
}: {
done: boolean,
value: ?any,
...
}): void | Promise<void> {
if (done) {
close(response);
return;
}
const buffer: Uint8Array = (value: any);
processBinaryChunk(response, buffer);
return reader.read().then(progress).catch(error);
}
function error(e: any) {
reportGlobalError(response, e);
}
reader.read().then(progress).catch(error);
}

This piece of code looks similar to what we did in our demo. It basically just reads the data chunk by chunk, then calls processBinaryChunk().

4.2.1 processBinaryChunk() processes each chunk

js
export function processBinaryChunk(
response: Response,
chunk: Uint8Array,
): void {
if (!supportsBinaryStreams) {
throw new Error("This environment don't support binary chunks.");
}
const stringDecoder = response._stringDecoder;
let linebreak = chunk.indexOf(10); // newline
while (linebreak > -1) {
const fullrow =
response._partialRow +
readFinalStringChunk(stringDecoder, chunk.subarray(0, linebreak));
processFullRow(response, fullrow);
response._partialRow = '';
chunk = chunk.subarray(linebreak + 1);
linebreak = chunk.indexOf(10); // newline
}
response._partialRow += readPartialStringChunk(stringDecoder, chunk);
}

It merely decodes the text, splits it into newline-delimited rows, and processes each full row.

function processFullRow(response: Response, row: string): void {
if (row === "") {
return;
}
const colon = row.indexOf(":", 0);
const id = parseInt(row.substring(0, colon), 16);
const tag = row[colon + 1];
// When tags that are not text are added, check them here before
// parsing the row as text.
// switch (tag) {
// }
switch (tag) {
case "I": {
resolveModule(response, id, row.substring(colon + 2));
return;
}
case "E": {
const errorInfo = JSON.parse(row.substring(colon + 2));
if (__DEV__) {
resolveErrorDev(
response,
id,
errorInfo.digest,
errorInfo.message,
errorInfo.stack
);
} else {
resolveErrorProd(response, id, errorInfo.digest);
}
return;
}
default: {
// We assume anything else is JSON.
resolveModel(response, id, row.substring(colon + 1));
return;
}
}
}

So each row is, by default, treated as a model chunk, meaning a React tree in JSON, and is resolved by resolveModel(). But if the row is tagged with I, it is a module chunk and is resolved by resolveModule().

export function resolveModel(
response: Response,
id: number,
model: UninitializedModel
): void {
const chunks = response._chunks;
const chunk = chunks.get(id);
if (!chunk) {
chunks.set(id, createResolvedModelChunk(response, model));
} else {
resolveModelChunk(chunk, model);

resolving a promise from outside

}
}

Remember that a Chunk is a promise, so if the chunk is used before it is resolved, rendering suspends. Resolving the promise then causes a re-render.

resolveModule() is slightly more complex: it needs to chain onto the promise of fetching the module rather than wait for a non-existent chunk.

js
export function resolveModule(
  response: Response,
  id: number,
  model: UninitializedModel,
): void {
  const chunks = response._chunks;
  const chunk = chunks.get(id);
  const clientReferenceMetadata: ClientReferenceMetadata = parseModel(
    response,
    model,
  );
  const clientReference = resolveClientReference<$FlowFixMe>(
    response._bundlerConfig,
    clientReferenceMetadata,
  );
  // TODO: Add an option to encode modules that are lazy loaded.
  // For now we preload all modules as early as possible since it's likely
  // that we'll need them.
  const promise = preloadModule(clientReference);
  if (promise) {
    let blockedChunk: BlockedChunk<any>;
    if (!chunk) {
      // Technically, we should just treat promise as the chunk in this
      // case. Because it'll just behave as any other promise.
      blockedChunk = createBlockedChunk(response);
      chunks.set(id, blockedChunk);
    } else {
      // This can't actually happen because we don't have any forward
      // references to modules.
      blockedChunk = (chunk: any);
      blockedChunk.status = BLOCKED;
    }
    promise.then(
      () => resolveModuleChunk(blockedChunk, clientReference),
      error => triggerErrorOnChunk(blockedChunk, error),
    );
  } else {
    if (!chunk) {
      chunks.set(id, createResolvedModuleChunk(response, clientReference));
    } else {
      // This can't actually happen because we don't have any forward
      // references to modules.
      resolveModuleChunk(chunk, clientReference);
    }
  }
}

function resolveModuleChunk<T>(
  chunk: SomeChunk<T>,
  value: ClientReference<T>,
): void {
  if (chunk.status !== PENDING && chunk.status !== BLOCKED) {
    // We already resolved. We didn't expect to see this.
    return;
  }
  const resolveListeners = chunk.value;
  const rejectListeners = chunk.reason;
  const resolvedChunk: ResolvedModuleChunk<T> = (chunk: any);
  resolvedChunk.status = RESOLVED_MODULE;
  resolvedChunk.value = value;
  if (resolveListeners !== null) {
    initializeModuleChunk(resolvedChunk);
    wakeChunkIfInitialized(chunk, resolveListeners, rejectListeners);
  }
}

Quite a lot of code, but it basically just extracts the module info from the chunk and gets a promise from preloadModule().

Refer to section 3.7.2.4 to see what is encoded in the payload.

So for $L1:

  1. if it is a model chunk, it is revived as a promise that waits until its final chunk comes in
  2. if it is a module chunk, it is revived as a promise that waits until its module is done preloading

In either case they fit into the same REACT_LAZY_TYPE data type. Awesome!
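That shared lazy shape can be sketched roughly like below. This is my own simplification, not React's actual code: the field names mimic React's internal lazy wrapper, but the chunk object and _init logic here are illustrative.

```javascript
// Simplified sketch: wrap any chunk (a thenable carrying status/value
// expandos) in a lazy element shape, so model chunks and module chunks
// can be rendered through the same REACT_LAZY_TYPE path.
const REACT_LAZY_TYPE = Symbol.for("react.lazy");

function createLazyChunkWrapper(chunk) {
  return {
    $$typeof: REACT_LAZY_TYPE,
    _payload: chunk, // the chunk, which is a thenable
    // reading an unresolved chunk throws (suspends); a resolved one
    // synchronously returns its value
    _init: (payload) => {
      if (payload.status !== "fulfilled") throw payload; // suspend
      return payload.value;
    },
  };
}

// a chunk that has already streamed in and settled:
const resolved = { status: "fulfilled", value: "hello", then() {} };
const lazy = createLazyChunkWrapper(resolved);
console.log(lazy._init(lazy._payload)); // "hello"
```

Whether the promise waits on a later chunk or on module preloading makes no difference to the renderer: it only sees a lazy wrapper around a thenable.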

4.3. getRoot() returns the first chunk, as the initial skeleton to render

js
export function getRoot<T>(response: Response): Thenable<T> {
  // 0 means the initial chunk,
  // which is why we said the initial chunk must be meaningful
  const chunk = getChunk(response, 0);
  return (chunk: any);
}

function getChunk(response: Response, id: number): SomeChunk<any> {
  const chunks = response._chunks;
  let chunk = chunks.get(id);
  if (!chunk) {
    chunk = createPendingChunk(response);
    chunks.set(id, chunk);
  }
  return chunk;
}

In theory, the first chunk (the chunk with id: 0, not the first chunk that arrives) is the skeleton chunk, so getRoot() is actually getting that first chunk.

4.3.1 createPendingChunk() returns a Chunk, which suspends since Chunk extends Promise

Before the chunk is ready, createPendingChunk() creates a Promise that will be fulfilled by the data streamed later.

This is a much better approach than ours in the demo, because there is a real promise chain that retains the lower priority for re-rendering. In our demo we used setState(), which has the default rendering priority and thus breaks the semantics of Suspense.

js
function createPendingChunk<T>(response: Response): PendingChunk<T> {
  // $FlowFixMe[invalid-constructor] Flow doesn't support functions as constructors
  return new Chunk(PENDING, null, null, response);
}

function Chunk(status: any, value: any, reason: any, response: Response) {
  this.status = status;
  this.value = value;
  this.reason = reason;
  this._response = response;
}
// We subclass Promise.prototype so that we get other methods like .catch
Chunk.prototype = (Object.create(Promise.prototype): any);
// TODO: This doesn't return a new Promise chain unlike the real .then
Chunk.prototype.then = function <T>(
  this: SomeChunk<T>,
  resolve: (value: T) => mixed,
  reject: (reason: mixed) => mixed
) {
  const chunk: SomeChunk<T> = this;
  // If we have resolved content, we try to initialize it first which
  // might put us back into one of the other states.
  switch (chunk.status) {
    case RESOLVED_MODEL:
      initializeModelChunk(chunk);
      break;
    case RESOLVED_MODULE:
      initializeModuleChunk(chunk);
      break;
  }
  // The status might have changed after initialization.
  switch (chunk.status) {
    case INITIALIZED:
      resolve(chunk.value);
      break;
    case PENDING:
    case BLOCKED:
      if (resolve) {
        if (chunk.value === null) {
          chunk.value = ([]: Array<(T) => mixed>);
        }
        chunk.value.push(resolve); // ← exposing resolver
      }
      if (reject) {
        if (chunk.reason === null) {
          chunk.reason = ([]: Array<(mixed) => mixed>);
        }
        chunk.reason.push(reject);
      }
      break;
    default:
      reject(chunk.reason);
      break;
  }
};

This is really hacky. It intercepts the .then() call and puts the resolver resolve() into chunk.value. Maybe the Promise.withResolvers() proposal could improve the code.
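Here is a sketch of what that could look like with Promise.withResolvers() (ES2024, available in Node.js 22+ and modern browsers, with a tiny fallback below for older runtimes). All names here are hypothetical, not React's code:

```javascript
// Fall back to a manual implementation if Promise.withResolvers() is missing.
const withResolvers =
  Promise.withResolvers?.bind(Promise) ??
  (() => {
    let resolve, reject;
    const promise = new Promise((res, rej) => {
      resolve = res;
      reject = rej;
    });
    return { promise, resolve, reject };
  });

// A pending chunk exposing its resolvers, without hijacking .then().
function createPendingChunkModern(response) {
  const { promise, resolve, reject } = withResolvers();
  // a real promise chain: .then()/.catch() keep their normal semantics
  return { status: "pending", promise, resolve, reject, _response: response };
}

const chunk = createPendingChunkModern(null);
chunk.resolve("row data arrived"); // called when the row streams in
chunk.promise.then((v) => console.log(v)); // logs "row data arrived"
```

The listeners then live inside a genuine promise, rather than in chunk.value/chunk.reason arrays that double as storage for the eventual value and error.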

js
export function resolveModel(
  response: Response,
  id: number,
  model: UninitializedModel
): void {
  const chunks = response._chunks;
  const chunk = chunks.get(id);
  if (!chunk) {
    chunks.set(id, createResolvedModelChunk(response, model));
  } else {
    resolveModelChunk(chunk, model);
  }
}

In resolveModel(), if the pending chunk is already there, it gets resolved.

js
function resolveModelChunk<T>(
  chunk: SomeChunk<T>,
  value: UninitializedModel
): void {
  if (chunk.status !== PENDING) {
    // We already resolved. We didn't expect to see this.
    return;
  }
  const resolveListeners = chunk.value;
  const rejectListeners = chunk.reason;
  const resolvedChunk: ResolvedModelChunk<T> = (chunk: any);
  resolvedChunk.status = RESOLVED_MODEL;
  resolvedChunk.value = value;
  if (resolveListeners !== null) {
    // This is unfortunate that we're reading this eagerly if
    // we already have listeners attached since they might no
    // longer be rendered or might not be the highest pri.
    initializeModelChunk(resolvedChunk);
    // The status might have changed after initialization.
    wakeChunkIfInitialized(chunk, resolveListeners, rejectListeners);
  }
}

function wakeChunkIfInitialized<T>(
  chunk: SomeChunk<T>,
  resolveListeners: Array<(T) => mixed>,
  rejectListeners: null | Array<(mixed) => mixed>,
): void {
  switch (chunk.status) {
    case INITIALIZED:
      wakeChunk(resolveListeners, chunk.value);
      break;
    case PENDING:
    case BLOCKED:
      chunk.value = resolveListeners;
      chunk.reason = rejectListeners;
      break;
    case ERRORED:
      if (rejectListeners) {
        wakeChunk(rejectListeners, chunk.reason);
      }
      break;
  }
}

function wakeChunk<T>(listeners: Array<(T) => mixed>, value: T): void {
  for (let i = 0; i < listeners.length; i++) {
    const listener = listeners[i];
    listener(value);
  }
}

OK, getRoot() returns the pending chunk, which is actually a promise. Then it is rendered by use().

jsx
return (
  <RouterContext.Provider>
    {use(content)}
  </RouterContext.Provider>
);

5. use() hook to render promise.

use() is a new hook that allows us to render a promise.

js
function use<T>(usable: Usable<T>): T {
  if (usable !== null && typeof usable === "object") {
    // $FlowFixMe[method-unbinding]
    if (typeof usable.then === "function") {
      // This is a thenable.
      const thenable: Thenable<T> = (usable: any);
      // Track the position of the thenable within this fiber.
      const index = thenableIndexCounter;
      thenableIndexCounter += 1;
      if (thenableState === null) {
        thenableState = createThenableState();
      }
      const result = trackUsedThenable(thenableState, thenable, index);
      if (
        currentlyRenderingFiber.alternate === null &&
        (workInProgressHook === null
          ? currentlyRenderingFiber.memoizedState === null
          : workInProgressHook.next === null)
      ) {
        // Initial render, and either this is the first time the component is
        // called, or there were no Hooks called after this use() the previous
        // time (perhaps because it threw). Subsequent Hook calls should use the
        // mount dispatcher.
        if (__DEV__) {
          ReactCurrentDispatcher.current = HooksDispatcherOnMountInDEV;
        } else {
          ReactCurrentDispatcher.current = HooksDispatcherOnMount;
        }
      }
      return result;
    } else if (
      usable.$$typeof === REACT_CONTEXT_TYPE ||
      usable.$$typeof === REACT_SERVER_CONTEXT_TYPE
    ) {
      const context: ReactContext<T> = (usable: any);
      return readContext(context);
    }
  }
  // eslint-disable-next-line react-internal/safe-string-coercion
  throw new Error("An unsupported type was passed to use(): " + String(usable));
}

We’ll dig into the details in following episodes, but for now just know that it suspends when the promise is not fulfilled; we can find the throw in trackUsedThenable().

js
export function trackUsedThenable<T>(
  thenableState: ThenableState,
  thenable: Thenable<T>,
  index: number
): T {
  if (__DEV__ && ReactCurrentActQueue.current !== null) {
    ReactCurrentActQueue.didUsePromise = true;
  }
  const previous = thenableState[index];
  if (previous === undefined) {
    thenableState.push(thenable);
  } else {
    if (previous !== thenable) {
      // Reuse the previous thenable, and drop the new one. We can assume
      // they represent the same value, because components are idempotent.
      // Avoid an unhandled rejection errors for the Promises that we'll
      // intentionally ignore.
      thenable.then(noop, noop);
      thenable = previous;
    }
  }
  // We use an expando to track the status and result of a thenable so that we
  // can synchronously unwrap the value. Think of this as an extension of the
  // Promise API, or a custom interface that is a superset of Thenable.
  //
  // If the thenable doesn't have a status, set it to "pending" and attach
  // a listener that will update its status and result when it resolves.
  switch (thenable.status) {
    case "fulfilled": {
      const fulfilledValue: T = thenable.value;
      return fulfilledValue;
    }
    case "rejected": {
      const rejectedError = thenable.reason;
      throw rejectedError;
    }
    default: {
      if (typeof thenable.status === "string") {
        // Only instrument the thenable if the status if not defined. If
        // it's defined, but an unknown value, assume it's been instrumented by
        // some custom userspace implementation. We treat it as "pending".
        // Attach a dummy listener, to ensure that any lazy initialization can
        // happen. Flight lazily parses JSON when the value is actually awaited.
        thenable.then(noop, noop);
      } else {
        const pendingThenable: PendingThenable<T> = (thenable: any);
        pendingThenable.status = "pending";
        pendingThenable.then(
          (fulfilledValue) => {
            if (thenable.status === "pending") {
              const fulfilledThenable: FulfilledThenable<T> = (thenable: any);
              fulfilledThenable.status = "fulfilled";
              fulfilledThenable.value = fulfilledValue;
            }
          },
          (error: mixed) => {
            if (thenable.status === "pending") {
              const rejectedThenable: RejectedThenable<T> = (thenable: any);
              rejectedThenable.status = "rejected";
              rejectedThenable.reason = error;
            }
          }
        );
      }
      // Check one more time in case the thenable resolved synchronously.
      switch (thenable.status) {
        case "fulfilled": {
          const fulfilledThenable: FulfilledThenable<T> = (thenable: any);
          return fulfilledThenable.value;
        }
        case "rejected": {
          const rejectedThenable: RejectedThenable<T> = (thenable: any);
          throw rejectedThenable.reason;
        }
      }
      // Suspend.
      //
      // Throwing here is an implementation detail that allows us to unwind the
      // call stack. But we shouldn't allow it to leak into userspace. Throw an
      // opaque placeholder value instead of the actual thenable. If it doesn't
      // get captured by the work loop, log a warning, because that means
      // something in userspace must have caught it.
      suspendedThenable = thenable;
      if (__DEV__) {
        needsToResetSuspendedThenableDEV = true;
      }
      throw SuspenseException;
    }
  }
}
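The core of trackUsedThenable() can be boiled down to a toy sketch. This is my own simplification, not React's code: unwrap synchronously if the thenable has settled, otherwise tag it with status expandos and throw to suspend.

```javascript
// Toy model of the suspend-on-thenable mechanism (illustrative only).
const SuspenseException = new Error("toy suspense");

function toyUse(thenable) {
  switch (thenable.status) {
    case "fulfilled":
      return thenable.value; // settled: unwrap synchronously
    case "rejected":
      throw thenable.reason;
    default:
      if (typeof thenable.status !== "string") {
        // First time we see it: tag it and record settlement on expandos.
        thenable.status = "pending";
        thenable.then(
          (value) => {
            if (thenable.status === "pending") {
              thenable.status = "fulfilled";
              thenable.value = value;
            }
          },
          (reason) => {
            if (thenable.status === "pending") {
              thenable.status = "rejected";
              thenable.reason = reason;
            }
          }
        );
      }
      // Not settled yet: throw so the work loop can unwind and retry later.
      throw SuspenseException;
  }
}

const p = Promise.resolve(42);
try {
  toyUse(p); // first call: not yet instrumented as settled → suspends
} catch (e) {
  console.log(e === SuspenseException); // true
}
```

On a later retry, after the promise has settled and the expandos were written, toyUse() finds status === "fulfilled" and returns the value synchronously, which is what makes the re-render succeed.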

6. Summary

Phew, this episode is very long and very interesting to dig into, so allow me to do a brief summary.

  1. when components get imported on the server, they are tagged to indicate whether they are Client or Server Components.
  2. when rendering the React tree on the server
    • a JSON.stringify() replacer lazily renders and serializes the tree
    • the serialization result is streamed by Chunk, identified by an auto-incrementing id; Chunks also have types
      • model chunk: a JSON chunk of the serialized React tree
      • module chunk: a chunk of module config so that the runtime can load the module
      • reference chunk (symbol chunk): a chunk for the built-in component symbols
    • for a Server Component that returns a Promise which doesn't get fulfilled right away
      • a new Task is scheduled to continue rendering its subtree in a new chunk when the promise resolves
      • a lazy chunk identifier $L{chunk id} is streamed down first
    • for a Client Component
      • a chunk with the client component reference (path, name, etc.) is streamed
      • a lazy chunk identifier $L{chunk id} is streamed down
    • for Symbols etc.
      • serialize by the tag: $S{tag}
  3. upon receiving the streamed data on the client
    • a ReadableStream reader is used to read the streamed data into a FlightResponse
    • FlightResponse holds all the processed chunks
    • each chunk row is processed based on its type, and a Chunk on the client is created with status and value
      • Chunk extends Promise by exposing resolvers and value on the instance directly, so they can be resolved later and externally
      • for a module chunk, a Promise to load the module is created
      • for a model chunk, a JSON.parse() reviver is responsible for reviving all the encoded data
        • for $L{chunk id}
          • if the chunk is a module chunk, create a pending chunk that loads the module
          • if the chunk is a model chunk, create a pending Chunk that waits for the chunk
          • either way, it is rendered with the same element type REACT_LAZY_TYPE, which throws if the promise is not resolved, thus suspending, and re-renders when the promise is resolved, just like how Suspense should work
        • for ${chunk id}, just get the chunk from FlightResponse by id
        • for $S{tag name}, just use the global Symbols
  4. when rendering the chunks
    • get chunk:0 from all the chunks, because the initial chunk should be the skeleton for the whole tree
    • the use() hook helps render a Promise; it throws if the Promise is not resolved
    • subsequent lazy chunks and preloaded module chunks automatically trigger re-renders, at low priority, as they already do
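To make the chunk types above concrete, here is an illustrative flight payload. The shape is what matters; the ids, module path, and exact encoding are made up for the example and are not byte-exact React output:

```javascript
// One row per chunk: "{id}:{optional tag}{json}".
const flightPayload = [
  // model chunk 0: the serialized skeleton; "$L1" is a lazy reference
  // to chunk 1
  '0:["$","div",null,{"children":"$L1"}]',
  // module chunk 1 (tagged "I"): a client reference the runtime must load
  '1:I["./Counter.client.js",["chunk0"],"Counter"]',
];

// each row splits into id and content at the first colon
const [id, content] = flightPayload[0].split(/:(.*)/s).slice(0, 2);
console.log(id, JSON.parse(content)[0]); // "0" "$"
```

The reviver then walks the parsed model, turning "$" into element markers, "$L1" into a lazy reference to chunk 1, and "$S{tag}" strings back into global Symbols.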

How RSC works.


Based on what we’ve learned here, I’d say our naive approach was kind of in the right direction but got the actual implementation wrong, especially on the client:

  1. we didn’t maintain the correct semantics on re-rendering when promises resolve
  2. we didn’t preload the Client Components
  3. we revived the symbols at the element level, when we could have just replaced the symbols
  4. we didn’t have a cache for chunks, and merging model chunks was not performant

On the server side, we had a very rough implementation:

  1. we didn’t handle manually thrown Promises
  2. our approach could run into the max call stack limitation
  3. we didn’t enforce the initial chunk to be meaningful

Overall, I’ll give our previous demo a score of 3 out of 10. What do you think?

Alright, hope this helps you understand RSC better. See you in the next episodes.

Want to know more about how React works internally?
Check out my series - React Internals Deep Dive!

