[Train] RunConfig doesn't get propagated from the Tuner to the Trainer #33539

@justinvyu

Description

What happened + What you expected to happen

When a RunConfig is specified on the Tuner but not on the Trainer, the Trainer has no access to the run config later on (for example, inside training_loop).

We should either:

  1. Propagate the Tuner's run config down to the Trainer.
  2. At least initialize the Trainer with a default run config, and make it clear in the docs that these properties cannot be accessed from within a Trainer.
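Option 1 could be sketched as follows. RunConfig, Trainer, and Tuner below are simplified stand-ins rather than the real Ray classes, and the copy-down logic is only an assumption about where such propagation might live:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class RunConfig:
    """Stand-in for ray.air.config.RunConfig (illustration only)."""
    local_dir: Optional[str] = None


@dataclass
class Trainer:
    """Stand-in for a Ray AIR Trainer (illustration only)."""
    run_config: Optional[RunConfig] = None


class Tuner:
    """Stand-in for ray.tune.Tuner that propagates its run config."""

    def __init__(self, trainable: Trainer, run_config: Optional[RunConfig] = None):
        # Option 1: if the trainable has no run config of its own, copy the
        # Tuner's run config down so the trainable can read it later.
        if run_config is not None and trainable.run_config is None:
            trainable.run_config = run_config
        self.trainable = trainable
        self.run_config = run_config


trainer = Trainer()
tuner = Tuner(trainer, run_config=RunConfig(local_dir="hi/jun"))
assert trainer.run_config is not None  # the Tuner's config is now visible
```

With this behavior, the assertion in the reproduction script below would pass, since the Trainer would see the Tuner's local_dir.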

cc: @gjoliver

Versions / Dependencies

Ray 2.3.0

Reproduction script

from ray.train.torch import TorchTrainer
from ray.air.config import ScalingConfig, RunConfig
from ray.tune import Tuner


class MyTrainer(TorchTrainer):
    def training_loop(self) -> None:
        # Fails: the RunConfig was passed only to the Tuner and is not
        # propagated, so the Trainer sees a default run config here.
        assert self.run_config.local_dir, "dude ... !"

trainer = MyTrainer(
    train_loop_per_worker=lambda: None,
    scaling_config=ScalingConfig(num_workers=2)
)

tuner = Tuner(trainer, run_config=RunConfig(local_dir="hi/jun"))

tuner.fit()

Issue Severity

Low: It annoys or frustrates me.

    Labels

    P2: Important issue, but not time-critical
    bug: Something that is supposed to be working, but isn't
    pending-cleanup: This issue is pending cleanup. It will be removed in 2 weeks after being assigned.
    ray-team-created: Ray Team created
    train: Ray Train related issue
    tune: Tune-related issues
