This repository was archived by the owner on Jul 16, 2025. It is now read-only.

feat: add Fabric patterns support #365

Closed
wants to merge 21 commits into from
e77de54
feat: add Fabric patterns support
OskarStark Jun 29, 2025
7f4057c
fix
OskarStark Jun 29, 2025
9523a7a
chore: add php-llm/fabric-pattern to require-dev and suggest
OskarStark Jun 29, 2025
7fed491
refactor: simplify fabric examples with directory check
OskarStark Jun 29, 2025
011e734
feat: add fabric pattern names to example outputs
OskarStark Jun 29, 2025
5253ef7
style: add missing newlines at end of example files
OskarStark Jun 29, 2025
ed498d8
refactor: replace isset with array_key_exists
OskarStark Jun 29, 2025
bcf494b
refactor: make FabricRepository mandatory in FabricInputProcessor
OskarStark Jun 29, 2025
f1e37c8
feat: add non-empty-string type annotation for pattern
OskarStark Jun 29, 2025
4064979
-
OskarStark Jun 29, 2025
c57da11
fix: check for Pattern class existence instead of directory structure
OskarStark Jul 14, 2025
77d9a5d
refactor: update fabric package detection and remove comments
OskarStark Jul 14, 2025
05709cb
chore: remove redundant comment from FabricInputProcessor
OskarStark Jul 14, 2025
02be26a
chore: combine error message into single echo statement
OskarStark Jul 14, 2025
d128ec0
refactor: use Pattern class existence check instead of directory path
OskarStark Jul 14, 2025
c933b88
feat: add validation to prevent multiple system messages
OskarStark Jul 14, 2025
80d4a41
refactor: use imported Pattern class instead of FQCN
OskarStark Jul 14, 2025
8a39981
refactor: simplify Fabric integration by using Pattern class directly
OskarStark Jul 14, 2025
8f2f5d3
-
OskarStark Jul 14, 2025
7da2538
Use package-specific exceptions instead of generic PHP exceptions
OskarStark Jul 15, 2025
b1327e5
Fix CS: Add missing newline at end of LogicException.php
OskarStark Jul 15, 2025
141 changes: 108 additions & 33 deletions README.md
Original file line number Diff line number Diff line change
@@ -801,7 +801,6 @@ final class MyProcessor implements OutputProcessorInterface, ChainAwareInterface
}
```


## Memory

LLM Chain supports adding contextual memory to your conversations, which allows the model to recall past interactions or relevant information from different sources. Memory providers inject information into the system prompt, providing the model with context without changing your application logic.
@@ -818,64 +817,140 @@ use PhpLlm\LlmChain\Chain\Memory\StaticMemoryProvider;

// Platform & LLM instantiation

$personalFacts = new StaticMemoryProvider(
'My name is Wilhelm Tell',
'I wish to be a swiss national hero',
'I am struggling with hitting apples but want to be professional with the bow and arrow',
// Create memory providers
$conversationHistory = new StaticMemoryProvider([
'User: What is the capital of France?',
'Assistant: The capital of France is Paris.',
]);

$contextualInfo = new StaticMemoryProvider([
'The user prefers concise answers',
'Current date: ' . date('Y-m-d'),
]);

// Create memory processor with providers
$memoryProcessor = new MemoryInputProcessor(
$conversationHistory,
$contextualInfo,
);
$memoryProcessor = new MemoryInputProcessor($personalFacts);

$chain = new Chain($platform, $model, [$memoryProcessor]);
$messages = new MessageBag(Message::ofUser('What do we do today?'));
$response = $chain->call($messages);
// Add to chain
$chain = new Chain(
$platform,
$model,
inputProcessors: [$memoryProcessor]
);

$response = $chain->call('What did I just ask you about?');
// The model now has access to the conversation history and can reference the previous question
```

### Memory Providers
### Available Memory Providers

#### Static Memory Provider

The library includes some implementations that are usable out of the box.
Pre-defined memories that don't change during execution:

```php
$staticMemory = new StaticMemoryProvider([
'User profile: Developer interested in AI',
'Application context: Customer support chatbot',
]);
```

#### Static Memory
#### Document Store Memory Provider

The static memory can be utilized to provide static information from, for example, user settings, basic knowledge of your application,
or anything else that should always be remembered, without you having to add it to the system prompt yourself.
Dynamically fetch relevant memories from a document store:

```php
use PhpLlm\LlmChain\Chain\Memory\StaticMemoryProvider;
use PhpLlm\LlmChain\Chain\Memory\DocumentStoreMemoryProvider;

$documentStore = // ... your document store instance
$embeddingsModel = // ... your embeddings model

$staticMemory = new StaticMemoryProvider(
'The user is allergic to nuts',
'The user prefers brief explanations',
$memoryProvider = new DocumentStoreMemoryProvider(
$documentStore,
$embeddingsModel,
10 // number of relevant documents to retrieve
);
```

#### Embedding Provider
This provider will search for relevant documents based on the current input and include them as context.

Based on an embedding storage, the given user message is used to inject knowledge from the storage. This could be general knowledge that was stored there and fits the user's input without the need for tools, or past conversation pieces that should be recalled for
the current message bag.
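As a sketch of how such a store-backed provider could be wired into a chain — reusing `$memoryProvider`, `$platform`, and `$model` from the surrounding snippets; the constructor signatures are assumptions based on the snippets in this README, not a definitive API:

```php
<?php

use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Chain\Memory\MemoryInputProcessor;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

// $memoryProvider, $platform, and $model as created above
$memoryProcessor = new MemoryInputProcessor($memoryProvider);
$chain = new Chain($platform, $model, inputProcessors: [$memoryProcessor]);

// Relevant documents are looked up for this message and injected as context
$messages = new MessageBag(Message::ofUser('How do I reset my password?'));
$response = $chain->call($messages);
```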
### Controlling Memory Usage

You can enable or disable memory on a per-call basis using the `use_memory` option:

```php
use PhpLlm\LlmChain\Chain\Memory\EmbeddingProvider;
// Disable memory for this specific call
$response = $chain->call('What is 2+2?', [
'use_memory' => false,
]);
```

$embeddingsMemory = new EmbeddingProvider(
$platform,
$embeddings, // Your embeddings model to use for vectorizing the users message
$store // Your vector store to query for fitting context
);
### Fabric Patterns

```
LLM Chain supports [Fabric](https://github.com/danielmiessler/fabric), a popular collection of system prompts from the AI community. These patterns provide pre-built, tested prompts for common tasks like summarization, analysis, and content creation.

### Dynamic Memory Usage
> [!NOTE]
> Using Fabric patterns requires the `php-llm/fabric-pattern` package to be installed separately.

The memory configuration applies globally to the chain. Sometimes you need to explicitly disable memory for calls where it is not needed or not appropriate. For this, the `use_memory` option is enabled by default but can be disabled per call.
#### Installation

```bash
composer require php-llm/fabric-pattern
```

#### Usage

```php
$response = $chain->call($messages, [
'use_memory' => false,
use PhpLlm\LlmChain\Platform\Fabric\FabricPrompt;
use PhpLlm\LlmChain\Platform\Message\Message;

// Create a system message from a Fabric pattern
$systemMessage = Message::fabric(FabricPrompt::summarize());

// Or customize it with additional instructions
$systemMessage = Message::fabric(FabricPrompt::summarize('Focus on technical details'));

$chain = new Chain($platform, $model);
$response = $chain->call([
$systemMessage,
Message::ofUser('Your long article text here...')
]);
```

#### Usage with Input Processor

For more flexibility, you can use the `FabricInputProcessor` to dynamically load patterns:

```php
use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Platform\Fabric\FabricInputProcessor;

// Initialize Platform and LLM

$processor = new FabricInputProcessor();
$chain = new Chain($platform, $model, [$processor]);

$messages = new MessageBag(
Message::ofUser('Analyze this article for potential security issues: ...')
);

// Use any Fabric pattern via options
$response = $chain->call($messages, ['fabric_pattern' => 'analyze_threat_report']);
```

#### Available Patterns

Some popular patterns include:
- `summarize` - Create concise summaries
- `analyze_claims` - Fact-check and analyze claims
- `extract_wisdom` - Extract key insights and wisdom
- `improve_writing` - Improve writing quality and clarity
- `create_quiz` - Generate quiz questions from content

For a full list of available patterns, visit the [Fabric patterns directory](https://github.com/danielmiessler/fabric/tree/main/patterns).
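For instance, any of the patterns listed above can be passed through the `fabric_pattern` option introduced in this PR — a sketch reusing `$platform` and `$model` from the earlier snippets:

```php
<?php

use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Platform\Fabric\FabricInputProcessor;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

$chain = new Chain($platform, $model, [new FabricInputProcessor()]);

$messages = new MessageBag(
    Message::ofUser('This sentence are bad grammar and unclear.')
);

// The processor prepends the pattern's system prompt before the model is called
$response = $chain->call($messages, ['fabric_pattern' => 'improve_writing']);

echo $response->getContent().\PHP_EOL;
```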

## HuggingFace

LLM Chain comes out of the box with an integration for [HuggingFace](https://huggingface.co/) which is a platform for
13 changes: 13 additions & 0 deletions composer.json
@@ -45,6 +45,7 @@
"mrmysql/youtube-transcript": "^v0.0.5",
"php-cs-fixer/shim": "^3.70",
"php-http/discovery": "^1.20",
"php-llm/fabric-pattern": "^0.1",
"phpstan/phpstan": "^2.0",
"phpstan/phpstan-symfony": "^2.0",
"phpstan/phpstan-webmozart-assert": "^2.0",
@@ -63,6 +64,18 @@
"symfony/process": "^6.4 || ^7.1",
"symfony/var-dumper": "^6.4 || ^7.1"
},
"suggest": {
"ext-pdo": "For using MariaDB as retrieval vector store.",
"async-aws/bedrock-runtime": "For using the Bedrock platform.",
"codewithkyrian/chromadb-php": "For using ChromaDB as retrieval vector store.",
"codewithkyrian/transformers": "For using TransformersPHP with FFI to run models in PHP.",
"doctrine/dbal": "For using MariaDB via Doctrine as retrieval vector store.",
"mongodb/mongodb": "For using MongoDB Atlas as retrieval vector store.",
"mrmysql/youtube-transcript": "For using the YouTube transcription tool.",
"php-llm/fabric-pattern": "For using Fabric patterns - a collection of pre-built system prompts.",
"probots-io/pinecone-php": "For using Pinecone as retrieval vector store.",
"symfony/dom-crawler": "For using the Crawler tool."
},
"config": {
"allow-plugins": {
"codewithkyrian/transformers-libsloader": true,
50 changes: 50 additions & 0 deletions examples/fabric/summarize.php
@@ -0,0 +1,50 @@
<?php

declare(strict_types=1);

use PhpLlm\FabricPattern\Pattern;
use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\GPT;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\PlatformFactory;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

require_once dirname(__DIR__).'/../vendor/autoload.php';

if (empty($_ENV['OPENAI_API_KEY'])) {
echo 'Please set the OPENAI_API_KEY environment variable.'.\PHP_EOL;
exit(1);
}

if (!class_exists(Pattern::class)) {
echo 'Fabric patterns are not installed. Please install them with: composer require php-llm/fabric-pattern'.\PHP_EOL;
exit(1);
}

$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
$model = new GPT(GPT::GPT_4O_MINI);
$chain = new Chain($platform, $model);

$article = <<<'ARTICLE'
The field of artificial intelligence has undergone dramatic transformations in recent years,
with large language models (LLMs) emerging as one of the most significant breakthroughs.
These models, trained on vast amounts of text data, have demonstrated remarkable capabilities
in understanding and generating human-like text. The implications for software development,
content creation, and human-computer interaction are profound.

However, with these advances come important considerations regarding ethics, bias, and the
responsible deployment of AI systems. Researchers and practitioners must work together to
ensure that these powerful tools are used in ways that benefit society while minimizing
potential harms.
ARTICLE;

$messages = new MessageBag(
Message::fabric('create_summary'),
Message::ofUser($article)
);

$response = $chain->call($messages);

echo 'Summary using Fabric pattern "create_summary":'.\PHP_EOL;
echo '=============================================='.\PHP_EOL;
echo $response->getContent().\PHP_EOL;
55 changes: 55 additions & 0 deletions examples/fabric/with-processor.php
@@ -0,0 +1,55 @@
<?php

declare(strict_types=1);

use PhpLlm\FabricPattern\Pattern;
use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\GPT;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\PlatformFactory;
use PhpLlm\LlmChain\Platform\Fabric\FabricInputProcessor;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

require_once dirname(__DIR__).'/../vendor/autoload.php';

if (empty($_ENV['OPENAI_API_KEY'])) {
echo 'Please set the OPENAI_API_KEY environment variable.'.\PHP_EOL;
exit(1);
}

if (!class_exists(Pattern::class)) {
echo 'Fabric patterns are not installed. Please install them with: composer require php-llm/fabric-pattern'.\PHP_EOL;
exit(1);
}

// Initialize platform and model
$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
$model = new GPT(GPT::GPT_4O_MINI);

// Create chain with Fabric processor
$processor = new FabricInputProcessor();
$chain = new Chain($platform, $model, [$processor]);

// Example code to analyze
$code = <<<'CODE'
function processUserData($data) {
$sql = "SELECT * FROM users WHERE id = " . $data['id'];
$result = mysql_query($sql);

while ($row = mysql_fetch_array($result)) {
echo $row['name'] . " - " . $row['email'];
}
}
CODE;

// Create messages
$messages = new MessageBag(
Message::ofUser("Analyze this PHP code for security issues:\n\n".$code)
);

// Call with Fabric pattern
$response = $chain->call($messages, ['fabric_pattern' => 'analyze_code']);

echo 'Code Analysis using Fabric pattern "analyze_code":'.\PHP_EOL;
echo '=================================================='.\PHP_EOL;
echo $response->getContent().\PHP_EOL;
9 changes: 9 additions & 0 deletions src/Platform/Exception/LogicException.php
@@ -0,0 +1,9 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Platform\Exception;

class LogicException extends \LogicException implements ExceptionInterface
{
}
49 changes: 49 additions & 0 deletions src/Platform/Fabric/FabricInputProcessor.php
@@ -0,0 +1,49 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Platform\Fabric;

use PhpLlm\FabricPattern\Pattern;
use PhpLlm\LlmChain\Chain\Input;
use PhpLlm\LlmChain\Chain\InputProcessorInterface;
use PhpLlm\LlmChain\Platform\Exception\InvalidArgumentException;
use PhpLlm\LlmChain\Platform\Exception\LogicException;
use PhpLlm\LlmChain\Platform\Exception\RuntimeException;
use PhpLlm\LlmChain\Platform\Message\SystemMessage;

/**
* Requires the "php-llm/fabric-pattern" package to be installed.
*/
final readonly class FabricInputProcessor implements InputProcessorInterface
{
public function processInput(Input $input): void
{
$options = $input->getOptions();

if (!\array_key_exists('fabric_pattern', $options)) {
return;
}

$pattern = $options['fabric_pattern'];
if (!\is_string($pattern)) {
throw new InvalidArgumentException('The "fabric_pattern" option must be a string');
}

if (null !== $input->messages->getSystemMessage()) {
throw new LogicException('Cannot add Fabric pattern: MessageBag already contains a system message');
}

if (!class_exists(Pattern::class)) {
throw new RuntimeException('Fabric patterns not found. Please install the "php-llm/fabric-pattern" package: composer require php-llm/fabric-pattern');
}

$content = (new Pattern())->load($pattern);
$systemMessage = new SystemMessage($content);

$input->messages = $input->messages->prepend($systemMessage);

unset($options['fabric_pattern']);
$input->setOptions($options);
}
}