PromptStream.AI — Token-aware prompt composition, validation, and conversational context toolkit for .NET.
Built atop Flow.AI.Core and TokenFlow.AI, PromptStream.AI enables developers to compose, validate, generate, and manage multi-turn AI prompts with token budgeting, variable interpolation, and contextual memory.
- 🧩 Token-aware prompt builder with variable interpolation (`{{variable}}` syntax)
- ✅ Validation engine for token limits, structure, and completeness
- 💬 Shared Core models from Flow.AI.Core (`PromptTemplate`, `PromptInstance`, `PromptMessage`, `PromptResponse`)
- 🧠 Context manager with replay, merge, summarization, and JSON persistence
- 💾 Persistent context storage (`ToJson`/`LoadFromJson`)
- 🧮 Token budgeting tools (`EstimateTokenUsage`, `TrimToTokenBudget`)
- ⚡ CLI utility (`PromptStream.AI.CLI`) for building, validating, analyzing, and generating prompts
- 🔌 Seamless integration with TokenFlow.AI for model-aware tokenization
```shell
dotnet add package PromptStream.AI
```

Requires:
- .NET 8.0 or higher
- Flow.AI.Core v0.2.0+
- (optional) TokenFlow.AI for advanced token metrics
```csharp
using System;
using System.Collections.Generic;
using Flow.AI.Core.Models;
using Flow.AI.Core.Interfaces;
using TokenFlow.AI.Integration;
using PromptStream.AI.Services;

// Initialize the service with token tracking
var tokenProvider = new BasicTokenFlowProvider();
var modelClient = new TokenFlowModelClient("gpt-4o-mini");
var context = new PromptContextManager();
var service = new PromptStreamService(tokenProvider, context, modelClient);

// Define a shared Core template
var template = new PromptTemplate
{
    Id = "summarize-v1",
    Template = "Summarize the following:\n\n{{input}}\n\nBe concise.",
    RequiredVariables = new() { "input" }
};

// Variables to inject
var variables = new Dictionary<string, string>
{
    ["input"] = "Flow.AI enables composable AI workflows for .NET developers."
};

// Build and validate
var (instance, validation) = service.BuildAndValidate(template, variables);

if (validation.IsValid)
{
    Console.WriteLine($"✅ Valid prompt ({validation.TokenCount} tokens)");
    Console.WriteLine(instance.RenderedText);
}
else
{
    Console.WriteLine($"❌ Invalid: {string.Join(", ", validation.Errors)}");
}
```
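The feature list above also names token-budgeting helpers (`EstimateTokenUsage` and `TrimToTokenBudget`), which the quickstart does not exercise. The sketch below assumes they are methods on `PromptStreamService` that take the rendered text (and a token budget); the exact signatures are an assumption, so check the API before relying on them:

```csharp
// Assumed signatures — EstimateTokenUsage/TrimToTokenBudget are named in the
// feature list above, but the parameter shapes here are guesses, not documented API.
var estimated = service.EstimateTokenUsage(instance.RenderedText);
if (estimated > 2000)
{
    // Trim the rendered prompt to fit a 2000-token budget before sending it on
    var trimmed = service.TrimToTokenBudget(instance.RenderedText, 2000);
    Console.WriteLine($"Prompt trimmed from {estimated} tokens to fit the budget");
}
```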
```csharp
// Add a user message to the context
context.AddMessage(new PromptMessage { Role = "user", Content = instance.RenderedText });
```

PromptStream.AI includes a full command-line interface for developers to build, validate, analyze, and generate prompts directly from the terminal.
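Before moving on to the CLI commands, note that the context built above does not have to stay in memory: the feature list includes JSON persistence via `ToJson`/`LoadFromJson`. The sketch below assumes these live on `PromptContextManager` with the obvious shapes (serialize to a string, load from one); the real signatures may differ:

```csharp
// Assumed shapes for the ToJson/LoadFromJson helpers named in the feature list;
// verify the actual signatures against the PromptStream.AI API.
var json = context.ToJson();
System.IO.File.WriteAllText("context.json", json);

// Later, or in another process: restore the saved conversation
var restored = PromptContextManager.LoadFromJson(System.IO.File.ReadAllText("context.json"));
```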
```shell
# Build a prompt
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- build --template "Hello {{name}}" --var name=Andrew

# Validate a prompt
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- validate --template "Summarize {{topic}}" --var topic="AI in .NET"

# Generate a response and save the context
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- generate --template "Explain {{concept}}" --var concept="tokenization" --save context.json

# Load and summarize a saved context
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- context --load context.json --summarize

# Analyze token usage and cost
dotnet run --project src/PromptStream.AI.CLI/PromptStream.AI.CLI.csproj -- analyze --template "Summarize {{topic}}" --var topic="AI" --model gpt-4o-mini
```

Available commands:
| Command | Description |
|---|---|
| `build` | Render a prompt with variable substitution |
| `validate` | Validate prompt completeness and token limits |
| `generate` | Build, validate, and produce a model-like response |
| `context` | Load, save, summarize, or clear conversation context |
| `analyze` | Estimate token usage and cost for prompts |
If you find PromptStream.AI helpful, please consider
⭐ starring the repository and ☕ supporting my work.
Your support helps keep the Flow.AI ecosystem growing.
Part of the Flow.AI Ecosystem
© 2025 Andrew Clements
