Replies: 1 comment
-
I've found a solution that is still pretty hacky but is satisfactory for me for now. I create a custom client with custom `CallOptions`. The custom `CallOptions` has a value for the fetch cache. In the client I read this option and set a faux header on the request. In an interceptor I read this header and set the actual fetch option for caching. This way the logic is cross-cutting across the application, but it is still dirty that it requires a fake HTTP header to piggyback on.

client.ts:

```ts
import { createNextClient } from "./promise-client";
import { createConnectTransport } from "@bufbuild/connect-web";
import { Interceptor } from "@bufbuild/connect";

// Import the service definitions that you want to connect to.
import { QueryService } from "@buf/crewlinker_nextback.bufbuild_connect-es/protoapi/v1/api_connect";
import { CommandService } from "@buf/crewlinker_nextback.bufbuild_connect-es/protoapi/v1/api_connect";

// Define a faux header that we use to transport cache settings from the
// custom client to the interceptor.
export const FetchCachePiggybackHeader = "X-FETCH-CACHE";

// The cacher interceptor modifies the client-side caching behaviour for the
// fetch call by inspecting the request message.
const cacher: Interceptor = (next) => async (req) => {
  // Check the piggyback header for our fetch caching so we can influence
  // the caching behaviour from our call site.
  const fetchCache = req.header.get(FetchCachePiggybackHeader);
  if (fetchCache !== null) {
    switch (fetchCache) {
      case "default":
      case "force-cache":
      case "no-cache":
      case "no-store":
      case "only-if-cached":
      case "reload":
        req.init.cache = fetchCache;
        break;
    }
    req.header.delete(FetchCachePiggybackHeader);
  }
  return await next(req);
};

// The transport defines what type of endpoint we're hitting.
// In our example we'll be communicating with a Connect endpoint.
const transport = createConnectTransport({
  baseUrl: process.env.PROTO_API_V1_BASE_URL!,
  useHttpGet: true,
  interceptors: [cacher],
});

// Export the clients.
export const query = createNextClient(QueryService, transport);
export const command = createNextClient(CommandService, transport);
```

promise-client.ts:

```ts
import {
  MethodInfo,
  MethodInfoBiDiStreaming,
  MethodInfoClientStreaming,
  MethodInfoServerStreaming,
  MethodInfoUnary,
  PartialMessage,
  ServiceType,
  Message,
  MethodKind,
} from "@bufbuild/protobuf";
import { createAsyncIterable } from "@bufbuild/connect/protocol";
import type { Transport, CallOptions } from "@bufbuild/connect";
import { makeAnyClient, ConnectError, Code } from "@bufbuild/connect";
import { FetchCachePiggybackHeader } from "./client";

// NextCallOptions exposes extra options to the call site.
export interface NextCallOptions extends CallOptions {
  fetchCache?: RequestCache;
}

// We need to pass our custom call option to the interceptor so it can modify
// raw fetch properties. There currently doesn't seem to be a way to pass
// arbitrary data through transport.unary(...), so we piggyback on a header.
function piggybackFetchCache(options?: NextCallOptions) {
  if (options && options.fetchCache) {
    const headers = new Headers(options.headers);
    headers.append(FetchCachePiggybackHeader, options.fetchCache);
    options.headers = headers;
  }
}

/**
 * NextClient is a promise client that supports unary and streaming methods
 * while also providing NextJS-specific fetch options. Methods will produce
 * a promise for the response message, or an asynchronous iterable of
 * response messages. The implementation is copied mostly from the reference
 * implementation mentioned here:
 * https://connectrpc.com/docs/web/using-clients#roll-your-own-client
 */
export type NextClient<T extends ServiceType> = {
  [P in keyof T["methods"]]: T["methods"][P] extends MethodInfoUnary<
    infer I,
    infer O
  >
    ? (request: PartialMessage<I>, options?: NextCallOptions) => Promise<O>
    : T["methods"][P] extends MethodInfoServerStreaming<infer I, infer O>
    ? (
        request: PartialMessage<I>,
        options?: NextCallOptions,
      ) => AsyncIterable<O>
    : T["methods"][P] extends MethodInfoClientStreaming<infer I, infer O>
    ? (
        request: AsyncIterable<PartialMessage<I>>,
        options?: NextCallOptions,
      ) => Promise<O>
    : T["methods"][P] extends MethodInfoBiDiStreaming<infer I, infer O>
    ? (
        request: AsyncIterable<PartialMessage<I>>,
        options?: NextCallOptions,
      ) => AsyncIterable<O>
    : never;
};

/**
 * Create a NextJS client for the given service, invoking RPCs through the
 * given transport while allowing Next-specific fetch options to be specified.
 */
export function createNextClient<T extends ServiceType>(
  service: T,
  transport: Transport,
) {
  return makeAnyClient(service, (method) => {
    switch (method.kind) {
      case MethodKind.Unary:
        return createUnaryFn(transport, service, method);
      case MethodKind.ServerStreaming:
        return createServerStreamingFn(transport, service, method);
      case MethodKind.ClientStreaming:
        return createClientStreamingFn(transport, service, method);
      case MethodKind.BiDiStreaming:
        return createBiDiStreamingFn(transport, service, method);
      default:
        return null;
    }
  }) as NextClient<T>;
}

/**
 * UnaryFn is the method signature for a unary method of a PromiseClient.
 */
type UnaryFn<I extends Message<I>, O extends Message<O>> = (
  request: PartialMessage<I>,
  options?: NextCallOptions,
) => Promise<O>;

function createUnaryFn<I extends Message<I>, O extends Message<O>>(
  transport: Transport,
  service: ServiceType,
  method: MethodInfo<I, O>,
): UnaryFn<I, O> {
  return async function (input, options) {
    piggybackFetchCache(options);
    const response = await transport.unary(
      service,
      method,
      options?.signal,
      options?.timeoutMs,
      options?.headers,
      input,
    );
    options?.onHeader?.(response.header);
    options?.onTrailer?.(response.trailer);
    return response.message;
  };
}

/**
 * ServerStreamingFn is the method signature for a server-streaming method of
 * a PromiseClient.
 */
type ServerStreamingFn<I extends Message<I>, O extends Message<O>> = (
  request: PartialMessage<I>,
  options?: NextCallOptions,
) => AsyncIterable<O>;

export function createServerStreamingFn<
  I extends Message<I>,
  O extends Message<O>,
>(
  transport: Transport,
  service: ServiceType,
  method: MethodInfo<I, O>,
): ServerStreamingFn<I, O> {
  return async function* (input, options): AsyncIterable<O> {
    const inputMessage =
      input instanceof method.I ? input : new method.I(input);
    piggybackFetchCache(options);
    const response = await transport.stream<I, O>(
      service,
      method,
      options?.signal,
      options?.timeoutMs,
      options?.headers,
      createAsyncIterable([inputMessage]),
    );
    options?.onHeader?.(response.header);
    yield* response.message;
    options?.onTrailer?.(response.trailer);
  };
}

/**
 * ClientStreamingFn is the method signature for a client-streaming method of
 * a PromiseClient.
 */
type ClientStreamingFn<I extends Message<I>, O extends Message<O>> = (
  request: AsyncIterable<PartialMessage<I>>,
  options?: NextCallOptions,
) => Promise<O>;

export function createClientStreamingFn<
  I extends Message<I>,
  O extends Message<O>,
>(
  transport: Transport,
  service: ServiceType,
  method: MethodInfo<I, O>,
): ClientStreamingFn<I, O> {
  return async function (
    request: AsyncIterable<PartialMessage<I>>,
    options?: NextCallOptions,
  ): Promise<O> {
    async function* input() {
      for await (const partial of request) {
        yield partial instanceof method.I ? partial : new method.I(partial);
      }
    }
    piggybackFetchCache(options);
    const response = await transport.stream<I, O>(
      service,
      method,
      options?.signal,
      options?.timeoutMs,
      options?.headers,
      input(),
    );
    options?.onHeader?.(response.header);
    let singleMessage: O | undefined;
    for await (const message of response.message) {
      singleMessage = message;
    }
    if (!singleMessage) {
      throw new ConnectError(
        "protocol error: missing response message",
        Code.Internal,
      );
    }
    options?.onTrailer?.(response.trailer);
    return singleMessage;
  };
}

/**
 * BiDiStreamingFn is the method signature for a bi-directional streaming
 * method of a PromiseClient.
 */
type BiDiStreamingFn<I extends Message<I>, O extends Message<O>> = (
  request: AsyncIterable<PartialMessage<I>>,
  options?: NextCallOptions,
) => AsyncIterable<O>;

export function createBiDiStreamingFn<
  I extends Message<I>,
  O extends Message<O>,
>(
  transport: Transport,
  service: ServiceType,
  method: MethodInfo<I, O>,
): BiDiStreamingFn<I, O> {
  return async function* (
    request: AsyncIterable<PartialMessage<I>>,
    options?: NextCallOptions,
  ): AsyncIterable<O> {
    async function* input() {
      for await (const partial of request) {
        yield partial instanceof method.I ? partial : new method.I(partial);
      }
    }
    piggybackFetchCache(options);
    const response = await transport.stream<I, O>(
      service,
      method,
      options?.signal,
      options?.timeoutMs,
      options?.headers,
      input(),
    );
    options?.onHeader?.(response.header);
    yield* response.message;
    options?.onTrailer?.(response.trailer);
  };
}
```

Now, in the application code I can call my RPC like this:

Page.tsx:

```tsx
import Link from "next/link";
import { query } from "@protoapi/client";

export default async function Page() {
  const res = await query.debugHealth({}, { fetchCache: "no-store" });
  return (
    <div>
      <p>Posts: {JSON.stringify(res)}</p>
      <Link href={"/vacancy"}>VACANCY</Link>
      <Link href={"/"}>HOME</Link>
    </div>
  );
}
```
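For reference, the validate-and-strip step at the heart of the interceptor can be distilled into a small standalone helper. The names `CACHE_MODES` and `extractFetchCache` are illustrative inventions of mine, not part of the connect API — this is just a sketch of the trick in isolation:

```typescript
// The allowed fetch cache modes, mirroring the switch statement in the
// interceptor above.
const CACHE_MODES = [
  "default",
  "force-cache",
  "no-cache",
  "no-store",
  "only-if-cached",
  "reload",
] as const;
type CacheMode = (typeof CACHE_MODES)[number];

// Read and remove the piggyback header, so the faux header never reaches
// the wire; return the value only if it is a valid cache mode.
function extractFetchCache(
  headers: Headers,
  headerName: string,
): CacheMode | undefined {
  const value = headers.get(headerName);
  headers.delete(headerName);
  return CACHE_MODES.includes(value as CacheMode)
    ? (value as CacheMode)
    : undefined;
}
```

Deleting the header before forwarding matters: otherwise the fake `X-FETCH-CACHE` header leaks to the server, which should never see it.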
-
I'm using the connect-web package in a Next.js project. Next.js recommends customising the fetch call's cache on a per-call basis, but the generated code from connect-web doesn't seem to allow for customising this? As an example of what I would hope is possible:
And I call it in app/page.tsx like so:
Since 0.9.0 it is possible to declare a custom fetch and I can do some customisation there, but what would be the correct way of passing options from my `query.webBlogIndex` call site to my custom fetch? Some interceptors? Maybe someone can help me?
NOTE: this is somewhat Next.js-specific, but I would imagine customising fetch's properties from the call site might be common for other projects as well.
EDIT: I've tried creating a custom client, but even then I'm running into the `transport.unary` call not being able to pass extra data that I can use in my custom fetch passed to `createConnectTransport`. For my use case it would be acceptable to not use a Connect transport at all and just use a fetch transport in the client; not sure if that would help?

EDIT 2: Another approach I've tried is to define a new field on the request protobuf message that describes the caching behaviour. This field can then be checked at runtime on the client (in an interceptor) to set the "cache" key on the fetch. But this forces developers to add this field to every message on the server, and the server arguably shouldn't even be aware of this behaviour.

EDIT 3: With a custom client that accepts custom CallOptions I'm almost there (see the solution comment above). But once again I'm not able to pass the caching option past the transport, into the fetch call.
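For the custom-fetch route mentioned in the first EDIT, one possible sketch is to wrap fetch before handing it to the transport's `fetch` option. The wrapper names below are hypothetical, and note the limitation this question is about: the mode is fixed per transport, not per call:

```typescript
// Hypothetical sketch: wrap a fetch-like function so that every request
// made through it gets a fixed cache mode. The wrapped function could be
// passed as the transport's custom fetch; it applies one mode to all
// calls on that transport, not per call.
type FetchLike = (
  input: string | URL | Request,
  init?: RequestInit,
) => Promise<Response>;

function withFetchCache(base: FetchLike, cache: RequestCache): FetchLike {
  // Spread the original init last-wins with our cache override, leaving
  // all other request options (method, headers, body) untouched.
  return (input, init) => base(input, { ...init, cache });
}
```

A per-call cache mode still needs some side channel past the transport (such as the header piggyback described in the answer), since the transport call signature has no slot for arbitrary extra data.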