Add stream_chat and conditionally set AutoModelClass to MllamaForConditionalGeneration #22547
Workflow file for this run

name: Linting
on:
  push:
    branches:
      - main
  pull_request:
env:
  POETRY_VERSION: "1.8.3"
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      # You can use PyPy versions in python-version.
      # For example, pypy-2.7 and pypy-3.8
      matrix:
        python-version: ["3.9"]
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: ${{ github.event_name == 'pull_request' && 2 || 0 }}
      - name: Install Poetry
        run: pipx install poetry==${{ env.POETRY_VERSION }}
      - name: Set up python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
          cache: "poetry"
          cache-dependency-path: "**/poetry.lock"
      - name: Install pre-commit
        shell: bash
        run: poetry run pip install pre-commit
      - name: Run linter
        shell: bash
        run: poetry run make lint
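
For reference, the CI steps above can be reproduced locally with roughly the following commands. This is a sketch, not part of the workflow itself: it assumes `pipx` is already installed, that the repository uses Poetry as the workflow implies, and that a `lint` make target exists (the `poetry install` step is an assumption added for local use, since CI restores dependencies from cache instead).

```shell
# Pin the same Poetry version the workflow uses (assumes pipx is available)
pipx install poetry==1.8.3

# Install project dependencies into Poetry's virtualenv
# (not an explicit workflow step; CI relies on the poetry cache)
poetry install

# Install pre-commit inside the environment, mirroring the "Install pre-commit" step
poetry run pip install pre-commit

# Run the lint target, as the "Run linter" step does
poetry run make lint
```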