
[Frontend]-config-cli-args #7737

Merged · 34 commits · Aug 30, 2024

Changes shown below are from 4 of the 34 commits.

Commits (34):
0d304d7 [Frontend]-config-cli-args (KaunilD, Aug 21, 2024)
f36dc39 Update vllm/scripts.py (KaunilD, Aug 21, 2024)
2bca2fa [Frontend]-config-cli-args (KaunilD, Aug 21, 2024)
ab570d1 [Frontend]-config-cli-args updated docs (KaunilD, Aug 21, 2024)
7bfc6cb [Frontend]-config-cli-args updated docs (KaunilD, Aug 22, 2024)
1779536 Update docs/source/serving/openai_compatible_server.md (KaunilD, Aug 23, 2024)
ff93954 [Frontend]-config-cli-args integrated configargparse (Aug 23, 2024)
a9492c4 [Frontend]-config-cli-args removed comfig.yaml (Aug 23, 2024)
05164e0 [Frontend]-config-cli-args removed comfig.yaml (Aug 23, 2024)
7014a4e [Frontend]-config-cli-args renamed function signature (Aug 23, 2024)
63413aa [Frontend]-config-cli-args formattting (Aug 23, 2024)
4d6f930 [Frontend]-config-cli-args added native support (Aug 26, 2024)
0d41c4c [Frontend]-config-cli-args added native support (Aug 26, 2024)
8d84671 [Frontend]-config-cli-args added native support (Aug 26, 2024)
c5af059 [Frontend]-config-cli-args added native support (Aug 26, 2024)
f6529e3 [Frontend]-config-cli-args added native support (Aug 26, 2024)
6e1fe11 [Frontend]-config-cli-args added tests (Aug 26, 2024)
175a0d5 [Frontend]-config-cli-args added tests (Aug 26, 2024)
7c06e17 [Frontend]-config-cli-args added tests (Aug 26, 2024)
98208b3 [Frontend]-config-cli-args added tests (Aug 26, 2024)
3d72a70 [Frontend]-config-cli-args updated tests (KaunilD, Aug 27, 2024)
10054a4 [Frontend]-config-cli-args updated tests (KaunilD, Aug 27, 2024)
b87593b [Frontend]-config-cli-args updated tests (KaunilD, Aug 27, 2024)
56a7054 [Frontend]-config-cli-args thinned diff (KaunilD, Aug 27, 2024)
2c7df07 [Frontend]-config-cli-args thinned diff (KaunilD, Aug 27, 2024)
7b77458 Merge branch 'main' into kaunild/frontend/config-cli-args (KaunilD, Aug 27, 2024)
a5b1a3a [Frontend]-config-cli-args updated tests (KaunilD, Aug 27, 2024)
d189970 Update vllm/utils.py (KaunilD, Aug 30, 2024)
ae178fb Update vllm/utils.py (KaunilD, Aug 30, 2024)
0c1b302 Update docs/source/serving/openai_compatible_server.md (KaunilD, Aug 30, 2024)
743aee5 Merge branch 'main' into kaunild/frontend/config-cli-args (KaunilD, Aug 30, 2024)
295f675 [Frontend]-config-cli-args (Aug 30, 2024)
960b047 Merge branch 'kaunild/frontend/config-cli-args' of github.com:KaunilD… (Aug 30, 2024)
b6f130d [Frontend]-config-cli-args (Aug 30, 2024)
23 changes: 23 additions & 0 deletions docs/source/serving/openai_compatible_server.md
@@ -111,6 +111,29 @@ directory [here](https://github.com/vllm-project/vllm/tree/main/examples/)
:prog: vllm serve
```

### Config file

You may also supply these CLI args using a config file. For example:

```yaml
# config.yaml

host: "127.0.0.1"
port: 6379
uvicorn-log-level: "info"
```

```bash
$ vllm serve --model SOME_MODEL --config config.yaml
```

Member:
`--model SOME_MODEL` can be in the `config.yaml` as well?

Contributor Author:
done. edited the doc.
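A config file reflecting that suggestion could look like this (a sketch only; `model` as a config key mirrors the `--model` flag, and `SOME_MODEL` remains a placeholder):

```yaml
# config.yaml
model: "SOME_MODEL"
host: "127.0.0.1"
port: 6379
uvicorn-log-level: "info"
```

```bash
$ vllm serve --config config.yaml
```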

---
**NOTE**

In case an argument is supplied via both the command line and the config file, the value from the config file will take precedence.

---

## Tool calling in the chat completion API
vLLM supports only named function calling in the chat completion API. The `tool_choice` options `auto` and `required` are **not yet supported** but on the roadmap.

17 changes: 17 additions & 0 deletions vllm/scripts.py
@@ -1,6 +1,7 @@
# The CLI entrypoint to vLLM.
import argparse
import asyncio
import configparser
import os
import signal
import sys
@@ -25,6 +26,13 @@ def signal_handler(sig, frame):


def serve(args: argparse.Namespace) -> None:
    if args.config:
        config_parser = configparser.ConfigParser()
        config_parser.read(args.config)

        for key, value in config_parser.items():
            setattr(args, key, value)

Member:
if we have --config, make sure we don't have any other commandline args?

Contributor Author:
i like that exclusivity as well. let me patch it.

Member:
wait, after a second thought, it is not clear to me if we should make it mutually exclusive

Member:
for example, if I have a config that has lots of options, I might want to simply reuse it and override some values from the command line, e.g. `vllm serve --config config.yaml -tp 2`

Contributor Author (KaunilD, Aug 21, 2024):
i see what you mean, i had the same picture in mind as well.

@youkaichao how about this?

let's not make it exclusive, to allow flexibility, BUT also raise an exception in case an option is specified in both the cli and the config file. this will prevent users from unknowingly launching their server with stale configs/args...

Member:
or print an INFO-level logging, telling users explicitly that some arg value from the command line takes precedence?

Contributor Author (KaunilD, Aug 21, 2024):
deal 🤝.. let me go home and patch this.

Contributor Author:
done. made cli the override
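A minimal sketch of the behavior the thread settles on (command line wins over the config file). This is not the code ultimately merged; the helper name `merge_config_args` and the use of PyYAML are assumptions:

```python
import yaml  # assumption: PyYAML is available


def merge_config_args(cli_args: list[str], config_path: str) -> list[str]:
    """Expand a YAML config file into flags placed before the user's own
    CLI flags, so argparse's last-occurrence-wins rule lets the command
    line override the file."""
    with open(config_path) as f:
        config = yaml.safe_load(f) or {}
    config_args: list[str] = []
    for key, value in config.items():
        config_args += [f"--{key}", str(value)]
    return config_args + cli_args


# If config.yaml contains `port: 6379`:
#   merge_config_args(["--port", "8000"], "config.yaml")
#   -> ["--port", "6379", "--port", "8000"]; argparse keeps 8000, the CLI value
```

Inserting the file's options ahead of the user's flags keeps the override logic entirely inside argparse, so no per-argument precedence bookkeeping is needed.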

    # The default value of `--model`
    if args.model != EngineArgs.model:
        raise ValueError(
@@ -125,6 +133,15 @@ def main():
    serve_parser.add_argument("model_tag",
                              type=str,
                              help="The model tag to serve")
    serve_parser.add_argument(
        "--config",
        type=str,
        required=False,
        default='',
        help="Read CLI options from a config file. "
             "Must be a YAML with the following options: "
             "https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html#command-line-arguments-for-the-server"
    )
    serve_parser = make_arg_parser(serve_parser)
    serve_parser.set_defaults(dispatch_function=serve)
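With the argument wired up, the subcommand would be invoked along these lines (the model tag is a placeholder, passed positionally per the parser above):

```bash
vllm serve SOME_MODEL --config config.yaml
```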
