[LLM-app] Cleaning old templates (#7262)
GitOrigin-RevId: 76882172dcce6dde64af646cc8ab0d828ddf6c60
olruas authored and Manul from Pathway committed Sep 11, 2024
1 parent 0385ea1 commit dde2bb0
Showing 17 changed files with 10 additions and 333 deletions.
@@ -189,7 +189,7 @@ Old values are still kept as the output is a log of insertion and suppression.
---
#default
- [Data indexing pipeline and RAG.](/developers/user-guide/llm-xpack/vectorstore_pipeline)
- [LLM-powered data pipeline.](/developers/templates/llm-alert-pathway)
- [Multimodal RAG.](/developers/templates/multimodal-rag)
- [Unstructured data to SQL on-the-fly.](/developers/templates/unstructured-to-structured)
::
::
@@ -58,7 +58,7 @@ Pick one and start your hands-on experience with Pathway today!
|--------------|--------------------------|------------------|-----------|
| **Basic LLM Tooling** | `pip install "pathway[xpack-llm]"` | Install common LLM libraries (OpenAI, Langchain, LlamaIndex) | [Learn more](/developers/user-guide/llm-xpack/overview) / [Examples](/developers/templates?category=llm#llm) |
| **Local LLM Deployment** | `pip install "pathway[xpack-llm-local]"` | Libraries for local deployment | |
| **Parsing Documents** | `pip install "pathway[xpack-llm-docs]"` | Tools for working with documents (PDFs, Microsoft Word) | [Contextful Parsing Pipeline](https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/contextful_parsing) |
| **Parsing Documents** | `pip install "pathway[xpack-llm-docs]"` | Tools for working with documents (PDFs, Microsoft Word) | |
| **Airbyte Connector** | `pip install "pathway[airbyte]"` | Support for Airbyte | [Example](/developers/templates/etl-python-airbyte/) |
| **SharePoint Connector** | `pip install "pathway[xpack-sharepoint]"` | Support for SharePoint | Requires a (free) [license key](/get-license) |
| **All** | `pip install "pathway[all]"` | Install all the optional packages | |
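As a quick sanity check after installing one of the extras listed above, the optional modules can be imported directly. This is a minimal sketch assuming the `xpack-llm` extra and an OpenAI API key are available; module paths follow the `pathway.xpacks.llm` layout and the model names are only examples.

```python
# Minimal check that the tooling installed by `pip install "pathway[xpack-llm]"` is importable.
import pathway as pw
from pathway.xpacks.llm import embedders, llms, splitters

# Wrappers around the OpenAI API (assumes OPENAI_API_KEY is set in the environment).
chat = llms.OpenAIChat(model="gpt-3.5-turbo")
embedder = embedders.OpenAIEmbedder(model="text-embedding-ada-002")

print(pw.__version__)
```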
@@ -60,4 +60,4 @@ class: 'mx-auto'

Note that the values of the column `messages` in the above example do not contain spaces. This is a restriction of `pw.debug.table_from_markdown`, which uses spaces to separate columns. Any regular string works with the other connectors.

If you want to see more examples with `pw.io.slack.send_alerts`, you can check the [`alert`](https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/alert) or [`drive_alert`](https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/drive_alert) examples in the llm-app, or our [showcase describing the drive alert example](/developers/templates/llm-alert-pathway/).
If you want to see more examples with `pw.io.slack.send_alerts`, you can check the [`drive_alert`](https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/drive_alert) example in the llm-app.
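For reference, here is a minimal end-to-end sketch of the pattern discussed above. It assumes a Slack channel id and token are available as environment variables; the exact parameter names of the connector may differ from its current signature.

```python
import os
import pathway as pw

# Toy alert table; pw.debug.table_from_markdown splits columns on whitespace,
# hence the values below contain no spaces.
messages = pw.debug.table_from_markdown(
    """
    messages
    pipeline_failed
    latency_spike
    """
)

# Forward every value of the `messages` column to a Slack channel.
pw.io.slack.send_alerts(
    messages.messages,
    slack_channel_id=os.environ["SLACK_CHANNEL_ID"],
    slack_token=os.environ["SLACK_TOKEN"],
)

pw.run()
```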
@@ -4,7 +4,6 @@
# date: '2023-11-15'
# thumbnail: '/assets/content/blog/th-computing-pagerank.png'
# tags: ['tutorial', 'engineering']
# related: ['/developers/templates/lsh/lsh_chapter1', '/developers/templates/llm-alert-pathway']
# keywords: ['index', 'indexing', 'join', 'asof join', 'asof_now', 'KNN']
# notebook_export_path: notebooks/tutorials/indexes.ipynb
# ---
@@ -431,8 +430,8 @@ class PointSchema(pw.Schema):
# run(data_dir=args.data_dir, host=args.host, port=args.port)
# ```
# %% [markdown]
# A similar approach was taken in our [alerting example](/developers/templates/llm-alert-pathway/).
# It is an LLM app that can send you alerts on Slack when the response to your query has changed significantly.
# A similar approach was taken in our [alerting example](https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/drive_alert).
# It is an LLM app that can send you alerts on Slack when the response to your query has changed significantly.
# %% [markdown]
# ## Summary
# In this article you learned about the differences in indexing between databases and Pathway. You can see that both approaches - keeping the queries to update them in the future or forgetting queries immediately after answering, are useful. It depends on your objective which approach should be used. Pathway provides methods to handle both variants.
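To make the contrast between the two approaches concrete, here is a minimal sketch in Pathway with placeholder schemas, paths, and column names; only the choice between a regular `join` and `asof_now_join` matters here.

```python
import pathway as pw

class QuerySchema(pw.Schema):   # placeholder schema for incoming queries
    key: int

class DataSchema(pw.Schema):    # placeholder schema for the indexed data
    key: int
    value: str

queries = pw.io.csv.read("./queries/", schema=QuerySchema)
data = pw.io.csv.read("./data/", schema=DataSchema)

# Approach 1: a regular join keeps every query in the engine, so its answer is
# recomputed (and the output updated) whenever matching data changes.
answers_live = queries.join(data, queries.key == data.key).select(
    queries.key, data.value
)

# Approach 2: asof_now_join answers each query only against the data available
# when the query arrives, then forgets the query; later data changes do not
# revise old answers.
answers_once = queries.asof_now_join(data, queries.key == data.key).select(
    queries.key, data.value
)
```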
68 changes: 1 addition & 67 deletions docs/2.developers/4.user-guide/70.llm-xpack/.llm-examples.md
@@ -51,7 +51,7 @@ If you want to see how it works, this page gathers practical examples using Path
</tbody>
</table>

## Simple examples
## Other examples

<table class="w-full">
<!-- <thead>
@@ -61,40 +61,6 @@ If you want to see how it works, this page gathers practical examples using Path
</tr>
</thead> -->
<tbody>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/contextless">Contextless Pipeline</a>
</td>
<td class="text-center">
This example implements a pipeline that answers a single question, without any context.
The query is passed to the model as-is, without any prompt engineering.
</td>
</tr>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/contextful">Contextful Pipeline</a>
</td>
<td class="text-center">
This example implements a simple pipeline that answers questions based on documents in a given folder.
The query is enhanced using prompt engineering: the most relevant documents are retrieved from the folder and added to the prompt, giving the LLM more context to answer the query.
</td>
</tr>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/contextful_s3">Contextful S3 Pipeline</a>
</td>
<td class="text-center">
This example implements a pipeline similar to the "contextful" one; it answers questions based on documents stored in S3.
</td>
</tr>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/local">Local Pipeline</a>
</td>
<td class="text-center">
This pipeline is similar to the contextful pipeline, but relies on local computations rather than querying an external API.
</td>
</tr>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/demo-document-indexing">Realtime Document Indexing with Pathway</a>
@@ -103,30 +69,6 @@ If you want to see how it works, this page gathers practical examples using Path
Basic example of a real-time document indexing pipeline powered by Pathway. You can index documents from different data sources, such as SharePoint or Google Drive. You can then query the index to retrieve documents, get statistics about the index, and retrieve file metadata.
</td>
</tr>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/contextful_parsing">Contextful Parsing Pipeline</a>
</td>
<td class="text-center">
This example implements a RAG pipeline, similar to the contextful pipeline. It parses unstructured documents such as PDFs.
</td>
</tr>
</tbody>
</table>

## Advanced examples


<table class="w-full">
<tbody>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/alert">Alert Pipeline</a>
</td>
<td class="text-center">
Example implementing a pipeline that answers questions based on documents in a given folder. Additionally, you can ask in your prompts to be notified of any changes; in that case an alert will be sent to a Slack channel.
</td>
</tr>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/drive_alert">Drive Alert Pipeline</a>
@@ -145,14 +87,6 @@ If you want to see how it works, this page gathers practical examples using Path
You can read more about this example in [our article](https://pathway.com/developers/templates/unstructured-to-structured/).
</td>
</tr>
<tr>
<td class="text-center">
<a href="https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/contextful_geometric">Contextful Geometric Pipeline</a>
</td>
<td class="text-center">
This example implements a pipeline that answers questions based on documents in a given folder. To find an answer, it sends an increasing number of documents to the LLM chat until an answer is found (see the sketch after this table). You can read more about the reasoning behind this approach <a href="https://pathway.com/developers/templates/adaptive-rag">here</a>.
</td>
</tr>
</tbody>
</table>
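The geometric strategy described above can be summarized in a few lines. This is an illustrative sketch with hypothetical `retrieve` and `ask_llm` helpers, not the llm-app implementation.

```python
def answer_with_geometric_context(question, retrieve, ask_llm,
                                  start_docs=2, factor=2, max_docs=64):
    """Retry the question with geometrically more context until the LLM answers."""
    n_docs = start_docs
    while n_docs <= max_docs:
        context = retrieve(question, k=n_docs)  # k most relevant chunks
        answer = ask_llm(question, context)
        if answer.strip().lower() != "i don't know":
            return answer
        n_docs *= factor  # expand the context geometrically and retry
    return None  # no answer found within the document budget
```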

2 changes: 1 addition & 1 deletion docs/2.developers/4.user-guide/70.llm-xpack/10.overview.md
@@ -161,7 +161,7 @@ texts = documents.select(chunk=splitter(pw.this.text))

`TokenCountSplitter` returns data in the same format as `ParseUnstructured`: for each row it returns a list of tuples, where each tuple consists of a string with the text of a chunk and a dictionary of associated metadata.

With these tools it is easy to create a Pathway pipeline that serves as a Vector Store but updates on each data change. You can check such an example in [the llm-app repository](https://github.com/pathwaycom/llm-app/blob/main/examples/pipelines/contextful-parsing/app.py). As it is a common pipeline, Pathway provides a [class `VectorStoreServer`](/developers/api-docs/pathway-xpacks-llm/vectorstore#pathway.xpacks.llm.vector_store.VectorStoreServer) which implements this pipeline.
With these tools it is easy to create a Pathway pipeline that serves as a Vector Store but updates on each data change. You can check such an example in [the llm-app repository](https://github.com/pathwaycom/llm-app/blob/main/examples/pipelines/demo-question-answering/app.py). As it is a common pipeline, Pathway provides a [class `VectorStoreServer`](/developers/api-docs/pathway-xpacks-llm/vectorstore#pathway.xpacks.llm.vector_store.VectorStoreServer) which implements this pipeline.
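A condensed sketch of such a pipeline, assuming a local folder as the data source and the OpenAI embedder; constructor arguments are abbreviated and the folder path, model name, host, and port are placeholders.

```python
import pathway as pw
from pathway.xpacks.llm import embedders, parsers, splitters
from pathway.xpacks.llm.vector_store import VectorStoreServer

# Documents are re-read, re-parsed, and re-indexed whenever the folder changes.
documents = pw.io.fs.read("./documents/", format="binary", with_metadata=True)

server = VectorStoreServer(
    documents,
    embedder=embedders.OpenAIEmbedder(model="text-embedding-ada-002"),
    parser=parsers.ParseUnstructured(),
    splitter=splitters.TokenCountSplitter(max_tokens=400),
)
server.run_server(host="127.0.0.1", port=8000)
```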


## Ready-to-use Vector Store
@@ -41,7 +41,7 @@ zoomable: true
---
::

For a ready-made implementation of the app from this guide, visit our GitHub repository at [llm-app](https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/contextful).
For a ready-made implementation of the app from this guide, visit our GitHub repository at [llm-app](https://github.com/pathwaycom/llm-app/tree/main/examples/pipelines/demo-question-answering).

::shoutout-banner
---
1 change: 0 additions & 1 deletion docs/2.developers/7.templates/.adaptive-rag/article.py
@@ -17,7 +17,6 @@
# extra_info: joint work with Jacek Kowalski, Szymon Dudycz
# keywords: ['LLM', 'RAG', 'Adaptive RAG', 'prompt engineering', 'prompt', 'explainability', 'notebook', 'Docker']
# run_template: "/developers/templates/template-adaptive-rag"
# popular: true
# ---

# # Adaptive RAG: cut your LLM costs without sacrificing accuracy
2 changes: 1 addition & 1 deletion docs/2.developers/7.templates/.multimodal-rag/article.py
@@ -7,7 +7,7 @@
# thumbnailFit: 'contain'
# date: '2024-06-20'
# tags: ['showcase', 'llm']
# keywords: ['LLM', 'RAG', 'GPT', 'OpenAI', 'GPT-4o', 'multimodal RAG', 'unstructured', 'notebook']
# keywords: ['LLM', 'RAG', 'GPT', 'OpenAI', 'GPT-4o', 'multimodal RAG', 'unstructured', 'notebook', 'docker']
# notebook_export_path: notebooks/showcases/multimodal-rag.ipynb
# run_template: "/developers/templates/template-multimodal-rag"
# popular: true
@@ -12,7 +12,6 @@
# author: 'berke'
# keywords: ['LLM', 'RAG', 'Adaptive RAG', 'prompt engineering', 'explainability', 'mistral', 'ollama', 'private rag', 'local rag', 'ollama rag', 'notebook', 'docker']
# run_template: "/developers/templates/template-private-rag"
# popular: true
# ---

# # Private RAG with Connected Data Sources using Mistral, Ollama, and Pathway
13 changes: 0 additions & 13 deletions docs/2.developers/7.templates/1005.template-contextful.md

This file was deleted.

13 changes: 0 additions & 13 deletions docs/2.developers/7.templates/1006.template-contextful-s3.md

This file was deleted.

13 changes: 0 additions & 13 deletions docs/2.developers/7.templates/1007.template-local-rag.md

This file was deleted.

13 changes: 0 additions & 13 deletions docs/2.developers/7.templates/1009.template-contextful-parsing.md

This file was deleted.

14 changes: 0 additions & 14 deletions docs/2.developers/7.templates/1010.template-alert.md

This file was deleted.
