TypeError: get_cosine_schedule_with_warmup() got an unexpected keyword argument 'num_decay_steps' #2841

Open
AndreR opened this issue Sep 19, 2024 · 1 comment

AndreR commented Sep 19, 2024

With the latest version, I get the error message shown in the title. How can I fix it? See also #2820.

You are using the default legacy behaviour of the <class 'transformers.models.t5.tokenization_t5.T5Tokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
2024-09-19 19:08:59 INFO     Building CLIP                                                              flux_utils.py:74
                    INFO     Loading state dict from P:/AI/Flux Encoders/clip_l.safetensors            flux_utils.py:167
                    INFO     Loaded CLIP: <All keys matched successfully>                              flux_utils.py:170
                    INFO     Loading state dict from P:/AI/Flux Encoders/t5xxl_fp16.safetensors        flux_utils.py:215
                    INFO     Loaded T5xxl: <All keys matched successfully>                             flux_utils.py:218
                    INFO     Building Flux model dev                                                    flux_utils.py:45
                    INFO     Loading state dict from P:/AI/Flux Encoders/flux1-dev.safetensors          flux_utils.py:52
2024-09-19 19:09:00 INFO     Loaded Flux: <All keys matched successfully>                               flux_utils.py:55
                    INFO     enable block swap: double_blocks_to_swap=0, single_blocks_to_swap=0       flux_train.py:272
number of trainable parameters: 11901408320
prepare optimizer, data loader etc.
                    INFO     use 8-bit AdamW optimizer | {}                                           train_util.py:4383
override steps. steps for 2 epochs is / 指定エポックまでのステップ数: 320
Traceback (most recent call last):
  File "P:\AI\kohya_ss\sd-scripts\flux_train.py", line 908, in <module>
    train(args)
  File "P:\AI\kohya_ss\sd-scripts\flux_train.py", line 387, in train
    lr_scheduler = train_util.get_scheduler_fix(args, optimizer, accelerator.num_processes)
  File "P:\AI\kohya_ss\sd-scripts\library\train_util.py", line 4724, in get_scheduler_fix
    return schedule_func(
TypeError: get_cosine_schedule_with_warmup() got an unexpected keyword argument 'num_decay_steps'
Traceback (most recent call last):
  File "C:\Program Files\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Program Files\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "P:\AI\kohya_ss\venv\Scripts\accelerate.EXE\__main__.py", line 7, in <module>
  File "P:\AI\kohya_ss\venv\lib\site-packages\accelerate\commands\accelerate_cli.py", line 48, in main
    args.func(args)
  File "P:\AI\kohya_ss\venv\lib\site-packages\accelerate\commands\launch.py", line 1106, in launch_command
    simple_launcher(args)
  File "P:\AI\kohya_ss\venv\lib\site-packages\accelerate\commands\launch.py", line 704, in simple_launcher
    raise subprocess.CalledProcessError(returncode=process.returncode, cmd=cmd)
subprocess.CalledProcessError: Command '['P:\\AI\\kohya_ss\\venv\\Scripts\\python.exe', 'P:/AI/kohya_ss/sd-scripts/flux_train.py', '--config_file', 'P:/AI/LoRA/config_dreambooth-20240919-190842.toml']' returned non-zero exit status 1.
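For reference, the failure reproduces outside sd-scripts as well: transformers' `get_cosine_schedule_with_warmup` only takes `optimizer`, `num_warmup_steps`, `num_training_steps`, `num_cycles` and `last_epoch`, so any extra keyword such as `num_decay_steps` raises this exact `TypeError`. A minimal sketch with placeholder model and step counts:

```python
# Minimal reproduction, independent of sd-scripts (placeholder model and step counts).
import torch
from transformers import get_cosine_schedule_with_warmup

model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# num_decay_steps is not a parameter of this scheduler, so this call raises:
# TypeError: get_cosine_schedule_with_warmup() got an unexpected keyword argument 'num_decay_steps'
lr_scheduler = get_cosine_schedule_with_warmup(
    optimizer,
    num_warmup_steps=10,
    num_training_steps=320,
    num_decay_steps=100,
)
```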

@maxanier

Duplicate of #2812. There is a workaround, but I think it has already been fixed. Maybe try the search function ;)
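Until that fix is in the version you are running, a generic workaround is to forward only the keyword arguments the selected schedule function actually declares. This is just a sketch with a hypothetical helper, not the actual patch from the linked issue:

```python
# Sketch of a kwargs-filtering wrapper (hypothetical helper, not the upstream fix):
# only pass keywords that the schedule function declares, so extras such as
# num_decay_steps are silently dropped instead of raising TypeError.
import inspect

def call_schedule_func(schedule_func, optimizer, **kwargs):
    accepted = inspect.signature(schedule_func).parameters
    filtered = {k: v for k, v in kwargs.items() if k in accepted}
    return schedule_func(optimizer, **filtered)
```

Updating sd-scripts once the fix is released should make this unnecessary.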
