feat: Creating Career AI repository
Luis Chaparro committed Aug 13, 2024
0 parents · commit 50291e3
Showing 15 changed files with 701 additions and 0 deletions.
1 change: 1 addition & 0 deletions .env.example
@@ -0,0 +1 @@
OPENAI_API_KEY=
166 changes: 166 additions & 0 deletions .gitignore
@@ -0,0 +1,166 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock

# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/latest/usage/project/#working-with-version-control
.pdm.toml
.pdm-python
.pdm-build/

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
.idea/

tmp/*.png

.DS_Store
64 changes: 64 additions & 0 deletions README.md
@@ -0,0 +1,64 @@
# Career AI

This is a generative AI application developed by the Wizeline team for the activities held in Bogotá, Colombia, with students from San José de Castilla School and Villemar El Carmen School.

The application creates a story about a student's professional career, based on their interests.

## Technology Stack :computer:

- Python: 3.9.6
- Gradio: 4.19.2
- OpenAI: 1.35.15
- Pydantic: 2.8.2
- Requests: 2.26.0

## Installation :wrench:

Create a virtual environment

```bash
python3 -m venv venv
```

Activate the virtual environment

```bash
source venv/bin/activate
```

Install the dependencies

```bash
pip install -r requirements.txt
```

Set the `OPENAI_API_KEY` environment variable in a `.env` file, using `.env.example` as a template
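
For example, assuming you start from the provided template and replace the placeholder with your real key:

```bash
# Copy the template, then edit .env and add your key
cp .env.example .env
# .env should then contain a line like (placeholder value shown):
# OPENAI_API_KEY=sk-your-key-here
```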

## Usage :rocket:

Run the application

```bash
python main.py
```

Access the application in your browser: http://localhost:7860/

You will see the following screen:

![image](resources/empty_fields.png)

Fill in the fields with the student's information and click the "Submit" button.
The application will generate a story about the student's professional career today, in 5 years, and in 15 years.

![image](resources/today_story.png)

![image](resources/some_years.png)

![image](resources/many_years.png)

## Demo :tv:

You can watch a demo of the application at this link: [Career AI](https://drive.google.com/file/d/1RuJClFehM0FUOrMpM6oq8HLValp0XKHF/view?usp=drive_link)

Cheers!
Empty file added app/__init__.py
Empty file.
51 changes: 51 additions & 0 deletions app/api_logic.py
@@ -0,0 +1,51 @@
from openai import AsyncOpenAI
from dotenv import load_dotenv

from app.prompts import SYSTEM_ROLE

# Load the OpenAI API key from the .env file so the client can authenticate.
load_dotenv()

client = AsyncOpenAI()

# Every chat completion starts with the same system prompt.
GENERAL_MESSAGE = [
    {
        "role": "system",
        "content": SYSTEM_ROLE
    }
]

# Generation settings shared by every text completion request.
COMMON_SETTINGS = {
    'max_tokens': 500,
    'top_p': 1,
    'frequency_penalty': 0,
    'presence_penalty': 0,
}


async def get_completion_text(prompt: str, model: str = 'gpt-4-turbo', temperature: float = 0.5) -> str:
    """Send the user prompt to the chat model and return the generated text."""
    messages = GENERAL_MESSAGE + [
        {
            'role': 'user',
            'content': prompt
        }
    ]

    response = await client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=temperature,
        **COMMON_SETTINGS,
    )

    return response.choices[0].message.content


async def get_completion_image(prompt: str, model: str = 'dall-e-3') -> str:
    """Generate a single 1024x1024 image for the prompt and return its URL."""
    response = await client.images.generate(
        model=model,
        n=1,
        prompt=prompt,
        size='1024x1024',
    )

    return response.data[0].url
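
These helpers are asynchronous, so they must be awaited inside an event loop; in this commit that is presumably done from `main.py`, which is not shown above. A minimal sketch of how they could be called, with a made-up prompt used purely for illustration:

```python
import asyncio

from app.api_logic import get_completion_text, get_completion_image


async def demo() -> None:
    # Hypothetical prompt; the real prompts come from app/prompts.py (not shown here).
    prompt = "Tell the story of a student who dreams of becoming an engineer."

    story = await get_completion_text(prompt, temperature=0.5)
    image_url = await get_completion_image(prompt)

    print(story)
    print(image_url)


if __name__ == "__main__":
    asyncio.run(demo())
```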