feat: Implement responseJSONSchema option support #34

@tak0m0

Summary

The LanguageModelPromptOptions interface defines a responseJSONSchema option, but it is not currently implemented in the internal logic.

// interfaces.ts (current)
export interface LanguageModelPromptOptions {
  responseJSONSchema?: object;  // ← Defined but not implemented
  requestUUID?: string;
  timeout?: number;
}

Motivation

Structured output (JSON Schema-based grammar) is a common requirement for LLM applications. It ensures the model's response conforms to a specific schema, making it easier to parse and validate responses programmatically.

node-llama-cpp supports this feature via llama.createGrammarForJsonSchema(schema), which can be leveraged to implement this option.
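
For reference, this is roughly how the grammar API is used directly against node-llama-cpp (a minimal sketch based on its documented v3 API; the model path is a placeholder):

// Standalone node-llama-cpp usage sketch; the model path is a placeholder.
import { getLlama, LlamaChatSession } from 'node-llama-cpp';

const llama = await getLlama();
const model = await llama.loadModel({ modelPath: 'path/to/model.gguf' });
const context = await model.createContext();
const session = new LlamaChatSession({
  contextSequence: context.getSequence(),
});

// The grammar constrains token sampling so the output conforms to the schema.
const grammar = await llama.createGrammarForJsonSchema({
  type: 'object',
  properties: {
    name: { type: 'string' },
    age: { type: 'number' },
  },
} as const);

const answer = await session.prompt('Return a person object', { grammar });
console.log(grammar.parse(answer)); // parsed and typed according to the schema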

Proposed Implementation

In the prompt() method of the LanguageModel class, check for the responseJSONSchema option and create a grammar when it is present:

async prompt(
  input: LanguageModelPrompt | LanguageModelPrompt[],
  options?: InternalLanguageModelPromptOptions,
): Promise<string> {
  // ... existing validation and input processing (producing processedInput) ...

  const promptOptions: Parameters<typeof this.session.prompt>[1] = {
    temperature: this.temperature,
    signal: options?.signal,
    stopOnAbortSignal: true,
    topK: this.topK,
  };

  // Constrain generation to the provided JSON Schema
  // (a type assertion may be needed here; see Type Consideration below).
  if (options?.responseJSONSchema) {
    const llamaCpp = await getLlamaCpp();
    const llama = await llamaCpp.getLlama();
    const grammar = await llama.createGrammarForJsonSchema(
      options.responseJSONSchema,
    );
    promptOptions.grammar = grammar;
  }

  const response = await this.session.prompt(processedInput, promptOptions);
  return response;
}

Type Consideration

There is a type compatibility issue between the responseJSONSchema option (currently typed as object) and node-llama-cpp's createGrammarForJsonSchema(), which expects a GbnfJsonSchema.

node-llama-cpp uses const generics for compile-time type inference:

createGrammarForJsonSchema<const T extends GbnfJsonSchema<Defs>, ...>(
  schema: Readonly<T> & GbnfJsonSchema<Defs>
): Promise<LlamaJsonSchemaGrammar<T, Defs>>;

This works well when the schema is defined inline with as const, but when the schema is received dynamically (as in this use case), a type assertion is required.

Options:

  1. Update interfaces.ts to use the GbnfJsonSchema type from node-llama-cpp (adds a dependency on node-llama-cpp types in interfaces.ts)
  2. Keep the object type and use a type assertion in the implementation (see the sketch below)
  3. Use a more permissive type such as Record<string, unknown>
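
As a rough illustration, option 2 could look like the following inside prompt() (a sketch only; it assumes node-llama-cpp exports the GbnfJsonSchema type for import):

import type { GbnfJsonSchema } from 'node-llama-cpp';

// The public option stays typed as `object`; the assertion is confined
// to the implementation so callers never see node-llama-cpp types.
if (options?.responseJSONSchema) {
  const llamaCpp = await getLlamaCpp();
  const llama = await llamaCpp.getLlama();
  const grammar = await llama.createGrammarForJsonSchema(
    options.responseJSONSchema as GbnfJsonSchema,
  );
  promptOptions.grammar = grammar;
}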

Expected Behavior

const schema = {
  type: 'object',
  properties: {
    name: { type: 'string' },
    age: { type: 'number' }
  },
} as const;

const response = await model.prompt('Return a person object', {
  responseJSONSchema: schema
});

// response is guaranteed to be valid JSON matching the schema
const parsed = JSON.parse(response);
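
On the caller side, the parsed value can then be narrowed to an application type (the Person interface below is hypothetical, matching the schema above):

interface Person {
  name: string;
  age: number;
}

// Safe to narrow because generation was constrained to the schema above.
const person = JSON.parse(response) as Person;
console.log(`${person.name} is ${person.age} years old`);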
