
Conversation

@calvin0327 (Contributor) commented May 29, 2025

Since PR #17214 was merged, a new issue has emerged: the prompt field of the RequestOutput object is empty when it is returned from the request queue. This occurs because, when the RequestState object is generated for each request, the prompt value returned by decoder_inputs.get("prompt") in the process_inputs function is already empty.

Therefore, when creating the RequestState object, we directly decode the prompt string from the request's prompt_token_ids and store it in the RequestState object.
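
For illustration, here is a minimal sketch of that fallback. The helper name and tokenizer wiring are assumptions for the example, not the actual vLLM patch:

```python
# Hypothetical sketch of the fix described above, not the actual vLLM
# code: when the processor yields no prompt string, detokenize the
# request's prompt token IDs so RequestOutput.prompt is populated.
from transformers import AutoTokenizer

# Illustrative tokenizer; any model's tokenizer works the same way here.
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")

def resolve_prompt(prompt_str, prompt_token_ids):
    """Prefer the parsed prompt string; fall back to decoding the token IDs."""
    if prompt_str:
        return prompt_str
    return tokenizer.decode(prompt_token_ids, skip_special_tokens=True)
```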

@github-actions bot:

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@mergify bot added the v1 label May 29, 2025
@calvin0327 (Contributor, Author) commented:

@DarkLight1337 @njhill /cc

PTAL, thanks.

@calvin0327 force-pushed the fix-create-request-state branch from 3955bb0 to f747fe5 on May 29, 2025 06:11
Signed-off-by: calvin chen <120380290@qq.com>
@calvin0327 force-pushed the fix-create-request-state branch from f747fe5 to dd5a4b8 on May 29, 2025 06:16
@calvin0327 (Contributor, Author) commented:

Test:

before: (screenshot omitted)

after: (screenshot omitted)
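
For reference, a minimal way to observe what the screenshots showed; the model name is only an example:

```python
# Reproduction sketch: before the fix, output.prompt came back empty;
# after it, the original prompt string is populated.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # illustrative model
outputs = llm.generate(["Hello, my name is"], SamplingParams(max_tokens=8))
for output in outputs:
    print(repr(output.prompt))  # expected: 'Hello, my name is', not empty
```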

@njhill (Member) commented May 29, 2025

Thanks @calvin0327... to clarify, this is just for the case where only prompt token ids and not a prompt string is provided in the request?

@calvin0327 (Contributor, Author) commented:

> Thanks @calvin0327... to clarify, this is just for the case where only prompt token ids and not a prompt string is provided in the request?

@njhill Thanks for the review. Regardless of whether the user provides prompt token IDs or a prompt string, the prompt field in the RequestOutput returned by AsyncLLM or LLMEngine will be missing. This is because there is an issue with parsing the prompt field when generating the RequestState, which results in it being empty.

@njhill (Member) commented May 31, 2025

> Thanks @calvin0327... to clarify, this is just for the case where only prompt token ids and not a prompt string is provided in the request?

> @njhill Thanks for the review. Regardless of whether the user provides prompt token IDs or a prompt string, the prompt field in the RequestOutput returned by AsyncLLM or LLMEngine will be missing. This is because there is an issue with parsing the prompt field when generating the RequestState, which results in it being empty.

@calvin0327 I don't see how #17214 could have affected this though?

I don't think we want to reconstruct the prompt via detokenization if it was passed in the request. We should just make sure that the input processor obtains the original prompt properly (e.g. by fixing the parsing if needed).
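
A rough sketch of that alternative, with names assumed for illustration rather than taken from the real processor code: keep the user-supplied prompt string alongside the tokenized inputs so it never needs to be reconstructed.

```python
# Hypothetical sketch of the suggested approach (not vLLM's actual
# process_inputs): carry the original prompt string through input
# parsing instead of detokenizing it later.
def process_inputs(self, prompt):
    if isinstance(prompt, str):
        # Text prompt: tokenize it, but keep the original string.
        prompt_str = prompt
        prompt_token_ids = self.tokenizer.encode(prompt)
    else:
        # Dict-style prompt with explicit token IDs; a prompt string
        # may or may not have been supplied alongside them.
        prompt_str = prompt.get("prompt")
        prompt_token_ids = prompt["prompt_token_ids"]
    return prompt_str, prompt_token_ids
```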

queue = RequestOutputCollector(output_kind=params.output_kind)

# Convert Input --> Request.
prompt_str, request = self.processor.process_inputs(
@calvin0327 (Contributor, Author) commented on the code above:

@njhill Yes, I agree with you that the best approach would be to parse the prompt string within this function. However, after examining the function, I couldn't find any logic that converts the prompt token IDs into a prompt string. Do you have any better ideas?

@mergify bot commented Jun 4, 2025

This pull request has merge conflicts that must be resolved before it can be merged. Please rebase the PR, @calvin0327.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

@mergify bot added the needs-rebase label Jun 4, 2025
@github-actions bot commented Sep 3, 2025

This pull request has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this pull request should remain open. Thank you!

@github-actions bot added the stale (Over 90 days of inactivity) label Sep 3, 2025
@github-actions bot commented Oct 4, 2025

This pull request has been automatically closed due to inactivity. Please feel free to reopen if you intend to continue working on it. Thank you!

@github-actions bot closed this Oct 4, 2025