add feynbot-ir #2

Merged · 2 commits · Jan 20, 2025
4 changes: 4 additions & 0 deletions .dockerignore
@@ -0,0 +1,4 @@
.*
venv/
__pycache__/
*.pyc
3 changes: 3 additions & 0 deletions .gitignore
@@ -166,3 +166,6 @@ cython_debug/
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/

.gradio/
.python-version
5 changes: 4 additions & 1 deletion README.md
@@ -1,11 +1,13 @@
# Feynbot: talking with INSPIRE

Feynbot is designed to help researchers explore high-energy physics content more intuitively by providing conversational access to the INSPIRE database for scientific literature.

## Usage guide

This guide assumes you have OpenSearch already available and running in your system.

### Local installation

1. Create a virtual environment and activate it:

`python3 -m venv .venv`
@@ -25,6 +27,7 @@ This guide assumes you have OpenSearch already available and running in your sys
`python3 src/demo.py`

### Docker

1. Build the Docker image:

`docker build -t feynbot .`
@@ -33,4 +36,4 @@ This guide assumes you have OpenSearch already available and running in your sys

`docker run -e OPENAI_API_KEY=<your_openai_api_key> -p 7860:7860 feynbot`

This will start the Gradio web app, which will be accessible at http://localhost:7860.
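
A quick way to confirm the app is serving (whether run locally or via `docker run`) before opening the browser is a plain HTTP check. This is a minimal sketch, assuming the default host and port from the README; `requests` is already pinned in requirements.txt.

```python
# Minimal smoke test: confirms the Gradio app described in the README is reachable.
# Assumes the default mapping from `docker run ... -p 7860:7860` (or a local run).
import requests

resp = requests.get("http://localhost:7860", timeout=10)
resp.raise_for_status()  # raises if the app is not serving yet
print("Feynbot UI is up, HTTP", resp.status_code)
```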
Binary file removed assets/feynbot.png
83 changes: 44 additions & 39 deletions requirements.txt
@@ -1,98 +1,103 @@
aiofiles==23.2.1
aiohappyeyeballs==2.4.4
aiohttp==3.11.10
aiosignal==1.3.1
aiohttp==3.11.11
aiosignal==1.3.2
annotated-types==0.7.0
anyio==4.7.0
attrs==24.2.0
anyio==4.8.0
attrs==24.3.0
beautifulsoup4==4.12.3
certifi==2024.8.30
charset-normalizer==3.4.0
click==8.1.7
certifi==2024.12.14
charset-normalizer==3.4.1
click==8.1.8
dataclasses-json==0.6.7
Deprecated==1.2.15
dirtyjson==1.0.8
distro==1.9.0
Events==0.5
fastapi==0.115.6
ffmpy==0.4.0
ffmpy==0.5.0
filelock==3.16.1
filetype==1.2.0
frozenlist==1.5.0
fsspec==2024.10.0
gradio==5.8.0
gradio_client==1.5.1
fsspec==2024.12.0
gradio==5.12.0
gradio_client==1.5.4
greenlet==3.1.1
h11==0.14.0
httpcore==1.0.7
httpx==0.27.2
huggingface-hub==0.26.5
huggingface-hub==0.27.1
idna==3.10
Jinja2==3.1.4
jiter==0.8.0
Jinja2==3.1.5
jiter==0.8.2
joblib==1.4.2
llama-cloud==0.1.6
llama-index==0.12.3
llama-index-agent-openai==0.4.0
llama-cloud==0.1.9
llama-index==0.12.11
llama-index-agent-openai==0.4.2
llama-index-cli==0.4.0
llama-index-core==0.12.3
llama-index-embeddings-ollama==0.4.0
llama-index-core==0.12.11
llama-index-embeddings-ollama==0.5.0
llama-index-embeddings-openai==0.3.1
llama-index-indices-managed-llama-cloud==0.6.3
llama-index-legacy==0.9.48.post4
llama-index-llms-ollama==0.4.2
llama-index-readers-file==0.4.1
llama-index-llms-ollama==0.5.0
llama-index-llms-openai==0.3.13
llama-index-multi-modal-llms-openai==0.4.2
llama-index-program-openai==0.3.1
llama-index-question-gen-openai==0.3.0
llama-index-readers-file==0.4.3
llama-index-readers-llama-parse==0.4.0
llama-index-vector-stores-opensearch==0.5.0
llama-parse==0.5.17
llama-index-vector-stores-opensearch==0.5.2
llama-parse==0.5.19
markdown-it-py==3.0.0
MarkupSafe==2.1.5
marshmallow==3.23.1
marshmallow==3.25.1
mdurl==0.1.2
multidict==6.1.0
mypy-extensions==1.0.0
nest-asyncio==1.6.0
networkx==3.4.2
nltk==3.9.1
numpy==2.1.3
ollama==0.3.3
numpy==2.2.1
ollama==0.4.6
openai==1.59.7
opensearch-py==2.8.0
orjson==3.10.12
orjson==3.10.14
packaging==24.2
pandas==2.2.3
pillow==11.0.0
pillow==11.1.0
propcache==0.2.1
pydantic==2.9.2
pydantic_core==2.23.4
pydantic==2.10.5
pydantic_core==2.27.2
pydub==0.25.1
Pygments==2.18.0
Pygments==2.19.1
pypdf==5.1.0
python-dateutil==2.9.0.post0
python-multipart==0.0.19
python-multipart==0.0.20
pytz==2024.2
PyYAML==6.0.2
regex==2024.11.6
requests==2.32.3
rich==13.9.4
ruff==0.8.2
ruff==0.9.2
safehttpx==0.1.6
semantic-version==2.10.0
shellingham==1.5.4
six==1.17.0
sniffio==1.3.1
soupsieve==2.6
SQLAlchemy==2.0.36
SQLAlchemy==2.0.37
starlette==0.41.3
striprtf==0.0.26
tenacity==8.5.0
tenacity==9.0.0
tiktoken==0.8.0
tomlkit==0.13.2
tqdm==4.67.1
typer==0.15.1
typing-inspect==0.9.0
typing_extensions==4.12.2
tzdata==2024.2
urllib3==2.2.3
uvicorn==0.32.1
urllib3==2.3.0
uvicorn==0.34.0
websockets==14.1
wrapt==1.17.0
wrapt==1.17.2
yarl==1.18.3
58 changes: 41 additions & 17 deletions src/demo.py
@@ -1,44 +1,68 @@
import json
import gradio as gr
from utils import get_response, load_config
from feynbot.app import get_response, load_config
from feynbot_ir.app import search

if __name__ == "__main__":
config = load_config("config.yaml")
with open(config["gradio"]["questions"], "r") as f:
questions = json.load(f)

demo = gr.Interface(
FOOTER = ('Developed by: <img src="https://sinai.ujaen.es/sites/default/files/SINAI%20-%20logo%20tx%20azul%20%5Baf%5D.png" alt="Feynbot" width="150">'
"Hosted at CERN by the SIS team")

feynbot = gr.Interface(
fn=get_response,
inputs=[
gr.Textbox(
value="",
lines=3,
placeholder="Ask Feynbot anything...",
label="Question"
label="Question",
),
gr.Dropdown(
[question for question in questions.values()],
label="Examples",
info="Pick a question"
)
],
outputs=[
gr.Textbox(label="Answer", lines=5),
gr.Markdown(label="References")
info="Pick a question",
),
],
title="Feynbot: talking with INSPIRE",
description=(
'Developed by: <a href="https://sinai.ujaen.es/"><img src="https://sinai.ujaen.es/sites/default/files/SINAI%20-%20logo%20tx%20azul%20%5Baf%5D.png" alt="Feynbot" width="150"></a>'
'<br>'
'<p>Ask anything or pick an example question from the dropdown below.</p>'
),
outputs=[gr.Textbox(label="Answer", lines=5), gr.Markdown(label="References")],
title="Feynbot Base",
description="Ask anything or pick an example question from the dropdown below.",
allow_flagging=config["gradio"]["allow_flagging"],
flagging_dir=config["gradio"]["flagging_dir"]
flagging_dir=config["gradio"]["flagging_dir"],
article=FOOTER,
)

with gr.Blocks() as feynbot_ir:
gr.Markdown("<h1 style='text-align: center;'>Feynbot IR on INSPIRE HEP Search</h1>")
gr.Markdown("""Specialized academic search tool that combines traditional
database searching with AI-powered query expansion and result
synthesis, focused on physics research papers.""")
with gr.Row():
with gr.Column():
query = gr.Textbox(label="Search Query", placeholder="Ask Feynbot anything...", lines=3)
model = gr.Dropdown(
choices=["llama3.2", "llama3.1:8b", "gemma2:27b", "mistral-small"],
value="llama3.1:8b",
label="Model (select or free-text)",
allow_custom_value=True
)
examples = gr.Examples([["Which is the closest star?"], ["Which particles does the Higgs Boson decay into?"]], query)
search_btn = gr.Button("Search")
gr.HTML(FOOTER)
with gr.Column():
results = gr.Markdown("Answer will appear here...", label="Search Results", )
search_btn.click(fn=search, inputs=[query, model], outputs=results, api_name="search", show_progress=True)

demo = gr.TabbedInterface(
[feynbot_ir, feynbot], ["Feynbot IR", "Feynbot Base"], theme="citrus"
)

demo.launch(
server_name="0.0.0.0",
share=config["gradio"]["share"],
root_path="/feynbot",
show_api=False
show_api=False,
debug=True,
)
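
For reference, the Blocks wiring above only fixes the callback contract: `search(query, model)` receives the free-text query and the selected model name and returns a Markdown string for the results pane. A minimal stand-in, useful for exercising the tabbed UI without OpenSearch or Ollama running, might look like the sketch below; the real `feynbot_ir.app.search` is not part of this diff, so this is only an assumed view of its signature.

```python
# Hypothetical stand-in for feynbot_ir.app.search, matching the signature implied
# by search_btn.click(fn=search, inputs=[query, model], outputs=results).
# It lets the demo run without a retrieval backend; the real function performs
# query expansion and retrieval against the INSPIRE index in OpenSearch.
def search(query: str, model: str) -> str:
    """Return a Markdown-formatted answer for the given query and model name."""
    return (
        f"**Query:** {query}\n\n"
        f"**Model:** {model}\n\n"
        "_No retrieval backend configured; this is a placeholder result._"
    )
```

The base tab likewise depends on `config.yaml` providing the `gradio.questions`, `gradio.allow_flagging`, `gradio.flagging_dir`, and `gradio.share` keys that `demo.py` reads at startup.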