Introduce FastAPI app for LLM-based research question generation #37
Conversation
Reviewer's Guide by Sourcery

This PR introduces a new FastAPI service for generating research questions using the Ollama LLM. The implementation includes a robust error-handling system, async client management, and structured request/response models using Pydantic. The service communicates with the Ollama API to generate a main research question and supporting sub-questions based on a given topic.

Sequence diagram for the Research Question Generation Process:

```mermaid
sequenceDiagram
    actor User
    participant FastAPI
    participant ResearchService
    participant OllamaAPI
    User->>FastAPI: POST /research {topic}
    FastAPI->>ResearchService: generate_research(input_data)
    ResearchService->>OllamaAPI: chat(model, messages)
    OllamaAPI-->>ResearchService: response
    ResearchService->>FastAPI: ResearchResponse
    FastAPI-->>User: ResearchResponse
```
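For illustration, a client call against this flow might look like the sketch below. The host/port is an assumption about a local dev deployment, and the payload field names (`topic`, `main_question`, `sub_questions`) are taken from the class diagram that follows:

```python
# Hypothetical client for the POST /research endpoint shown above.
import asyncio

import httpx


async def main() -> None:
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            "http://localhost:8000/research",  # assumed local dev host/port
            json={"topic": "memory in large language models"},
        )
        resp.raise_for_status()
        data = resp.json()
        print("Main question:", data["main_question"])
        for q in data["sub_questions"]:
            print("  -", q)


asyncio.run(main())
```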
Class diagram for the Research Question Generation Service:

```mermaid
classDiagram
    class ResearchInput {
        +String topic
    }
    class ResearchResponse {
        +String main_question
        +List~String~ sub_questions
    }
    class AsyncClient {
    }
    class FastAPI {
        +String title
        +String description
        +String version
    }
    class HTTPException {
        +int status_code
        +String detail
    }
    class Logger {
        +info(String)
        +error(String)
    }
    class ResearchService {
        +generate_research(ResearchInput) ResearchResponse
        +generate_research_questions(String) Dict
    }
    ResearchService --> ResearchInput
    ResearchService --> ResearchResponse
    ResearchService --> AsyncClient
    ResearchService --> HTTPException
    ResearchService --> Logger
    FastAPI --> ResearchService
    note for ResearchService "Handles research question generation and API endpoint"
```
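Putting the two diagrams together, a minimal sketch of the service could look as follows. This is a reconstruction from the diagrams, not the PR's actual code; the model name, prompt wording, and response parsing are assumptions:

```python
"""Sketch of the research question service described by the diagrams above."""

import logging
from typing import Dict, List

from fastapi import FastAPI, HTTPException
from ollama import AsyncClient
from pydantic import BaseModel

logger = logging.getLogger(__name__)

app = FastAPI(
    title="Research Question Generator",
    description="Generates research questions for a topic via an Ollama LLM",
    version="0.1.0",
)


class ResearchInput(BaseModel):
    topic: str


class ResearchResponse(BaseModel):
    main_question: str
    sub_questions: List[str]


async def generate_research_questions(topic: str) -> Dict:
    """Ask the LLM for one main question plus supporting sub-questions."""
    client = AsyncClient()
    prompt = (
        f"Propose one main research question about '{topic}' on the first "
        "line, followed by three supporting sub-questions, one per line."
    )
    response = await client.chat(
        model="llama3.2",  # assumed model name, not taken from the PR
        messages=[{"role": "user", "content": prompt}],
    )
    # Ollama chat responses expose the generated text under message/content.
    content = response["message"]["content"]
    lines = [ln.strip("- ").strip() for ln in content.splitlines() if ln.strip()]
    return {"main_question": lines[0], "sub_questions": lines[1:]}


@app.post("/research", response_model=ResearchResponse)
async def generate_research(input_data: ResearchInput) -> ResearchResponse:
    logger.info("Generating research questions for topic: %s", input_data.topic)
    try:
        result = await generate_research_questions(input_data.topic)
        return ResearchResponse(**result)
    except Exception as exc:  # broad catch is fine for a sketch
        logger.error("Question generation failed: %s", exc)
        raise HTTPException(status_code=500, detail=str(exc)) from exc
```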
File-Level Changes
Hey @leonvanbokhorst - I've reviewed your changes - here's some feedback:

Overall Comments:
- The `get_client()` context manager has an empty `finally` block. If no cleanup is needed, consider removing the context manager. If cleanup is needed (e.g., closing the client connection), implement it in the `finally` block.
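To make that suggestion concrete, here is a sketch of the pattern the comment describes. It uses `httpx.AsyncClient` (which has a documented `aclose()`) as a stand-in for whatever client `get_client()` actually constructs in the PR:

```python
from contextlib import asynccontextmanager

import httpx


@asynccontextmanager
async def get_client():
    # If the finally block stays empty, this context manager adds nothing
    # over constructing the client directly at the call site.
    client = httpx.AsyncClient()
    try:
        yield client
    finally:
        # Actual cleanup: release the connection pool when the caller is done.
        await client.aclose()
```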
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟢 Documentation: all looks good
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
Summary by Sourcery

New Features:
- Introduce a FastAPI service that generates a main research question and supporting sub-questions for a given topic via the Ollama LLM, with Pydantic request/response models and structured error handling.