Use `global_step` as the x-axis for wandb #558
Conversation
TensorboardLogger also has this function. So I think it's fine to create something like:

```python
class WandbLogger(TensorboardLogger):
    def __init__(self, *args, **kwargs):
        wandb.init(..., sync_tensorboard=True)
        super().__init__(*args, **kwargs)
```
There is an issue with this approach. The …
How about this:

```python
# logger/wandb_init.py
import wandb

wandb.init(..., sync_tensorboard=True)
```

```python
# logger/wandb.py
# from tianshou.utils.logger import wandb_init
from tianshou.utils.logger.tensorboard import TensorboardLogger

class WandbLogger(TensorboardLogger):
    pass
```

```python
# utils/__init__.py
# do not import wandb_init here
```

and in main.py:

```python
...
from tianshou.utils.logger import wandb_init, WandbLogger
...
if __name__ == "__main__":
    ...
```
This is the way CleanRL does it:
Yeah, I mean we can replace this functionality with a simple import.
XD, I didn't complete my message. I was writing that it can't be easily applied here, because the logger has other utilities, like saving and resuming data. How about something like:

```python
if args.logger == "wandb":
    logger = WandbLogger(
        save_interval=1,
        name=log_name,
        run_id=args.resume_id,
        config=args,
    )
writer = SummaryWriter(log_path)
writer.add_text("args", str(args))
if args.logger == "wandb":
    logger.load(writer)
```

and in the …
Per conversation with @Trinkle23897, the latest code adopts the following style:

```python
if args.logger == "wandb":
    logger = WandbLogger(
        save_interval=1,
        name=f"{args.task}__{log_name}__{args.seed}__{int(time.time())}",
        run_id=args.resume_id,
        config=args,
    )
writer = SummaryWriter(log_path)
writer.add_text("args", str(args))
if args.logger == "tensorboard":
    logger = TensorboardLogger(writer)
if args.logger == "wandb":
    logger.load(writer)
```

https://wandb.ai/costa-huang/tianshou/runs/uktkei7h?workspace=user-costa-huang tracks this run.
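The ordering in the adopted style is deliberate: `WandbLogger` is constructed (and hence `wandb.init(..., sync_tensorboard=True)` runs) before the `SummaryWriter` is created, so wandb can hook tensorboard's writers; the writer is then attached afterwards via `logger.load(writer)`. A minimal pure-Python sketch of the same control flow, with hypothetical stub classes standing in for the real Tianshou and torch objects:

```python
class StubWriter:
    """Stands in for torch.utils.tensorboard.SummaryWriter."""
    def add_text(self, tag, text):
        pass

class StubTensorboardLogger:
    """Stands in for tianshou.utils.TensorboardLogger."""
    def __init__(self, writer):
        self.writer = writer

class StubWandbLogger:
    """Stands in for tianshou.utils.WandbLogger."""
    def __init__(self, **kwargs):
        # In the real logger, wandb.init(..., sync_tensorboard=True)
        # would run here, before any SummaryWriter exists.
        self.writer = None

    def load(self, writer):
        # Attach the SummaryWriter after construction.
        self.writer = writer

def build_logger(logger_name):
    # Mirrors the adopted dispatch: wandb logger first, writer second,
    # then either wrap the writer (tensorboard) or attach it (wandb).
    if logger_name == "wandb":
        logger = StubWandbLogger(save_interval=1)
    writer = StubWriter()
    writer.add_text("args", "...")
    if logger_name == "tensorboard":
        logger = StubTensorboardLogger(writer)
    if logger_name == "wandb":
        logger.load(writer)
    return logger

print(type(build_logger("wandb")).__name__)  # -> StubWandbLogger
```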
Codecov Report

```diff
@@           Coverage Diff            @@
##           master     #558   +/-   ##
=========================================
- Coverage   93.88%   93.85%   -0.04%
=========================================
  Files          64       64
  Lines        4368     4376       +8
=========================================
+ Hits         4101     4107       +6
- Misses        267      269       +2
```
|
Everything looks good, except I think we should add `algo_name` to the args variable. Also, do you have a tracked run?

All good on my end.
* Use `global_step` as the x-axis for wandb
* Use the Tensorboard SummaryWriter as the core with `wandb.init(..., sync_tensorboard=True)`
* Update all Atari examples with wandb

Co-authored-by: Jiayi Weng <trinkle23897@gmail.com>
make format (required)
make commit-checks (required)

Tianshou already supports W&B logging via #426. The current logging solution uses two custom x-axes, `train/env_step` and `test/env_step`. Such usage might be less desirable because `train/env_step` and `test/env_step` share virtually the same values, so we should use a single key such as `global_step`; with `global_step` as the common x-axis, we can still see `train/reward` and `test_reward` as the y-axis (see Support experiment tracking with W&B DLR-RM/rl-baselines3-zoo#213).

To help address this issue, this PR uses `global_step` as the x-axis for wandb logging. Additionally, this PR allows users to override the default wandb project via environment variables.

Alternatives considered
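The PR's original environment-variable example was not captured in this page. As an illustrative sketch: `wandb.init()` consults the standard `WANDB_PROJECT` environment variable when no explicit `project=` argument is passed, so an override could look like `WANDB_PROJECT=my-project python atari_dqn.py --logger wandb` (the project name here is hypothetical). The snippet below mimics that lookup with a fallback default:

```python
import os

# wandb.init() falls back to the WANDB_PROJECT environment variable when
# no explicit project= argument is given; this mimics that lookup, using
# "tianshou" as the fallback default for illustration.
project = os.environ.get("WANDB_PROJECT", "tianshou")
print(project)
```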
An alternative plan is to remove the `WandbLogger` altogether and instead use wandb's tensorboard integration (i.e. `wandb.init(..., sync_tensorboard=True)` with a plain `TensorboardLogger`). While this is possible, `WandbLogger` currently does more, such as resuming training, so removing it is a bit more complicated.