
sort imports
rasoro committed Sep 27, 2023
1 parent 8f7caf6 commit 3c2c71b
Showing 7 changed files with 17 additions and 11 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/ci.yaml
@@ -30,9 +30,9 @@ jobs:
- name: Run tests
run: |
poetry run coverage run -m unittest discover ./app/tests/
poetry run coverage report -m
poetry run coverage report
poetry run coverage xml
working-directory: ${{ github.workspace }}

- name: Upload coverage report
uses: codecov/codecov-action@v2

8 changes: 5 additions & 3 deletions README.md
@@ -1,3 +1,5 @@
[![CI](https://github.com/weni-ai/SentenX/actions/workflows/ci.yaml/badge.svg)](https://github.com/weni-ai/SentenX/actions/workflows/ci.yaml)

# SentenX

microservice that uses a sentence transformer model to index and search records.
@@ -52,7 +54,7 @@ uvicorn app.main:main_app.api --reload

### Docker compose

to start sentenx with elasticsearch on with docker compose:
to start sentenx with elasticsearch with docker compose:

setup `AWS_SECRET_ACCESS_KEY` and `AWS_ACCESS_KEY_ID` on `docker-compose.yml`
```
@@ -66,7 +68,7 @@ docker compose down

to start with rebuild after any change on source:
```
docker compose up -d --
docker compose up -d --build
```


@@ -167,7 +169,7 @@ status: 200
"products": [
{
"facebook_id": "1",
"title": "leite em pó 200g",
"title": "massa para bolo de baunilha",
"org_id": "1",
"channel_id": "5",
"catalog_id": "asdfgh4321",
3 changes: 2 additions & 1 deletion app/handlers/products.py
@@ -1,8 +1,9 @@
from fastapi import APIRouter, HTTPException
from fastapi.logger import logger
from pydantic import BaseModel

from app.handlers import IDocumentHandler
from app.indexer import IDocumentIndexer
from pydantic import BaseModel


class Product(BaseModel):
1 change: 1 addition & 0 deletions app/indexer/products.py
@@ -1,4 +1,5 @@
from langchain.docstore.document import Document

from app.handlers.products import Product
from app.indexer import IDocumentIndexer
from app.store import IStorage
10 changes: 5 additions & 5 deletions app/main.py
@@ -1,3 +1,8 @@
from fastapi import FastAPI
from langchain.embeddings import SagemakerEndpointEmbeddings, HuggingFaceHubEmbeddings
from langchain.embeddings.base import Embeddings
from langchain.vectorstores import ElasticVectorSearch, VectorStore

from app.handlers import IDocumentHandler
from app.handlers.products import ProductsHandler
from app.indexer import IDocumentIndexer
@@ -6,11 +11,6 @@
from app.config import AppConfig
from app.util import ContentHandler

from fastapi import FastAPI
from langchain.embeddings import SagemakerEndpointEmbeddings, HuggingFaceHubEmbeddings
from langchain.embeddings.base import Embeddings
from langchain.vectorstores import ElasticVectorSearch, VectorStore


class App:
api: FastAPI
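Taken together, the two app/main.py hunks move the third-party imports (fastapi and langchain) above the local app.* imports. A minimal sketch of the resulting import block, assuming the context lines elided between the hunks keep their current order and listing only the imports visible in this diff:

```
# Sketch of the app/main.py import header after this commit.
# Only imports shown in the diff are listed; elided context lines are omitted.

# third-party packages
from fastapi import FastAPI
from langchain.embeddings import SagemakerEndpointEmbeddings, HuggingFaceHubEmbeddings
from langchain.embeddings.base import Embeddings
from langchain.vectorstores import ElasticVectorSearch, VectorStore

# local application modules
from app.handlers import IDocumentHandler
from app.handlers.products import ProductsHandler
from app.indexer import IDocumentIndexer
from app.config import AppConfig
from app.util import ContentHandler
```

The one-line additions in app/handlers/products.py, app/indexer/products.py, app/store/elasticsearch_vector_store.py, and app/util.py appear to apply the same grouping: standard library, then third-party packages, then app modules, separated by blank lines.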
1 change: 1 addition & 0 deletions app/store/elasticsearch_vector_store.py
@@ -1,5 +1,6 @@
from langchain.vectorstores import VectorStore
from langchain.docstore.document import Document

from app.store import IStorage


1 change: 1 addition & 0 deletions app/util.py
@@ -1,4 +1,5 @@
import json

from langchain.embeddings.sagemaker_endpoint import EmbeddingsContentHandler


