
Master optimum-intel for llava #1076

Draft · wants to merge 2 commits into master
Conversation

Wovchena (Collaborator)

No description provided.

@github-actions github-actions bot added the labels category: sampling (Sampling / Decoding algorithms) and category: GHA (CI based on Github actions) on Oct 25, 2024
@ilya-lavrenov (Contributor)

Looks like it's safe to merge?

@ilya-lavrenov ilya-lavrenov self-assigned this Oct 28, 2024
@Wovchena (Collaborator, Author)

The hope was that it would fix llava-next output, but it didn't. So I don't care whether all the models are tested with a fork or some of them are tested with upstream.

@@ -725,8 +725,7 @@ jobs:
          python -m pip install --upgrade-strategy eager -r ./samples/requirements.txt opencv-python --pre --extra-index-url https://storage.openvinotoolkit.org/simple/wheels/nightly
      - name: Download and convert MiniCPM-V-2_6 model and an image
        run: |
          python -m pip install git+https://github.com/eaidova/optimum-intel.git@ea/minicpmv
          python -m pip install -U "optimum<1.23" --no-dependencies
          python -m pip install git+https://github.com/eaidova/optimum-intel.git@ea/minicpmv --no-dependencies
Contributor


this branch was merged to main recently
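
Given that note, the fork install in the step above could presumably be switched to upstream optimum-intel. The lines below are only a sketch of that variant, not part of this PR: the upstream URL, keeping the "optimum<1.23" pin, and keeping --no-dependencies are all assumptions.

    # Sketch: install optimum-intel from the upstream repository instead of the
    # ea/minicpmv fork branch; --no-dependencies keeps the previously installed
    # nightly wheels in place (assumed to still be desired).
    python -m pip install -U "optimum<1.23" --no-dependencies
    python -m pip install git+https://github.com/huggingface/optimum-intel.git --no-dependencies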

@ilya-lavrenov ilya-lavrenov added this to the 2025.0 milestone Nov 5, 2024
Labels
category: GHA (CI based on Github actions), category: sampling (Sampling / Decoding algorithms)
2 participants