Commit 5ff9948

Merge pull request #548 from telerik/yoan/q2-2025-release

GenAI changes

2 parents 7aed2d8 + 80fa936, commit 5ff9948

2 files changed: +7 -42 lines


libraries/radpdfprocessing/features/gen-ai-powered-document-insights/partial-context-question-processor.md

Lines changed: 4 additions & 0 deletions

```diff
@@ -83,6 +83,10 @@ A sample custom implementation for the OllamaEmbeddingsStorage is shown in the b
 > * **Telerik.Windows.Documents.AIConnector**
 > * **Telerik.Windows.Documents.Fixed**
 
+1. Install Ollama from [ollama.com](https://ollama.com/).
+2. Pull the model you want to use.
+3. Start the Ollama server.
+
 <snippet id='libraries-pdf-features-gen-ai-ask-questions-using-partial-context-ollama-embeddings-storage'/>
 
 #### Example 3: Processing Specific Pages
```
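The three numbered setup steps above can be sketched as shell commands, mirroring the comments that were inlined in the removed Ollama example elsewhere in this commit; `llama3` is just an example model name, and the default endpoint is assumed from that removed snippet:

```shell
# 1. Install Ollama (platform installers at https://ollama.com/)

# 2. Pull the model you want to use (llama3 shown as an example)
ollama pull llama3

# 3. Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```

This is an environment-setup fragment; it requires a local Ollama installation to run.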

libraries/radpdfprocessing/features/gen-ai-powered-document-insights/prerequisites.md

Lines changed: 3 additions & 42 deletions

````diff
@@ -63,24 +63,7 @@ Before using the GenAI-powered Document Insights functionality, you need to set
 
 #### __[C#] Example 1: Setting up Azure OpenAI__
 
-```csharp
-using Microsoft.Extensions.AI;
-using Azure.AI.OpenAI;
-
-// Set up Azure OpenAI client
-string key = "your-azure-openai-key";
-string endpoint = "https://your-resource-name.openai.azure.com/";
-string deploymentName = "your-deployment-name";
-
-AzureOpenAIClient azureClient = new(
-    new Uri(endpoint),
-    new Azure.AzureKeyCredential(key),
-    new AzureOpenAIClientOptions());
-ChatClient chatClient = azureClient.GetChatClient(deploymentName);
-
-IChatClient iChatClient = new OpenAIChatClient(chatClient);
-int maxTokenLimit = 128000; // Adjust based on your model
-```
+<snippet id='libraries-pdf-features-gen-ai-setup-azure-open-ai'/>
 
 ### OpenAI Setup
 
````
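Whichever provider is configured (Azure OpenAI, OpenAI, or Ollama), the result is the same pair of values: an `IChatClient` and a `maxTokenLimit`, which the Telerik.Windows.Documents.AIConnector processors consume. A minimal sketch of that hand-off, assuming the `SummarizationProcessor` type and its `Summarize(RadFixedDocument)` method as described in the related Document Insights articles (treat the exact namespaces and signatures as assumptions, not confirmed API):

```csharp
using Microsoft.Extensions.AI;
using Telerik.Windows.Documents.AIConnector;
using Telerik.Windows.Documents.Fixed.Model;

// iChatClient and maxTokenLimit come from one of the provider setup
// examples in this diff; document is a RadFixedDocument already loaded
// (e.g. via PdfFormatProvider). Type and method names are assumed from
// the surrounding Telerik docs, not verified here.
using (SummarizationProcessor processor = new SummarizationProcessor(iChatClient, maxTokenLimit))
{
    string summary = await processor.Summarize(document);
}
```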
````diff
@@ -89,18 +72,7 @@ int maxTokenLimit = 128000; // Adjust based on your model
 
 #### __[C#] Example 2: Setting up OpenAI__
 
-```csharp
-using Microsoft.Extensions.AI;
-using OpenAI;
-
-// Set up OpenAI client
-string key = "your-openai-api-key";
-string modelId = "gpt-4o-mini";
-
-OpenAIClient openAIClient = new OpenAIClient(key);
-IChatClient client = openAIClient.AsChatClient(modelId);
-int maxTokenLimit = 128000; // Adjust based on your model
-```
+<snippet id='libraries-pdf-features-gen-ai-setup-open-ai'/>
 
 ### Ollama Setup (Local AI)
 
````
````diff
@@ -112,18 +84,7 @@ Ollama allows you to run AI models locally on your machine. This is useful for d
 
 #### __[C#] Example 3: Setting up Ollama__
 
-```csharp
-using Microsoft.Extensions.AI;
-
-// Install and run Ollama:
-// 1. Install Ollama: https://ollama.com/
-// 2. Pull the model: ollama pull llama3
-// 3. Ensure Ollama is running: ollama serve
-
-// Set up Ollama client
-IChatClient iChatClient = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3");
-int maxTokenLimit = 4096; // Adjust based on your model
-```
+<snippet id='libraries-pdf-features-gen-ai-setup-ollama-ai'/>
 
 ## See Also
 
````
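When targeting the local Ollama endpoint from the removed Example 3 (`http://localhost:11434/`), a quick sanity check before wiring up `OllamaChatClient` is Ollama's REST API; the `/api/tags` route lists locally installed models (default port assumed from that example):

```shell
# Lists locally pulled models; fails if `ollama serve` is not running.
curl http://localhost:11434/api/tags
```

This requires a running local Ollama server, so it is a setup check rather than a portable script.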
