Custom GraphGym config not working #5211

Open
M-Lampert opened this issue Aug 16, 2022 · 11 comments

@M-Lampert
Contributor

🐛 Describe the bug

Registering custom configs in GraphGym does not work. Even the custom configs specified in the shipped example cannot be accessed. To reproduce:

  1. Clone PyG from master
  2. Try to access the example custom config in graphgym/main.py by adding the following after line 31:
    print(cfg.example_arg)
  3. Run run_single.sh

If I do this, I get the following error:
Traceback (most recent call last):
  File "...\pytorch_geometric\graphgym\main.py", line 32, in <module>
    print(cfg.example_arg)
  File "...\.conda\envs\test\lib\site-packages\yacs\config.py", line 141, in __getattr__
    raise AttributeError(name)
AttributeError: example_arg

Environment

  • PyG version: 2.1.0
  • PyTorch version: 1.12.1
  • OS: Windows 11
  • Python version: 3.9.12
  • CUDA/cuDNN version: Running on CPU
  • How you installed PyTorch and PyG (conda, pip, source): via pip install git+https://github.com/pyg-team/pytorch_geometric
M-Lampert added the bug label on Aug 16, 2022
@rusty1s
Member

rusty1s commented Aug 17, 2022

Thanks for reporting. I think this is fully intentional, and the yacs library we are using internally does not support this either. We only want users to specify config parameters that GraphGym uses internally. What would be the use-case of this?

@M-Lampert
Contributor Author

Sorry, maybe my minimal working example was a bit too minimal.
I want to write a custom encoder whose options can then be configured via the .yaml file. But if I try to set a new value for a custom config, the following error occurs (here I tried to set a new value for the example custom config by adding a new line containing example_arg: test to the .yaml file):

Traceback (most recent call last):
  File "C:\Users\morit\workspace\pytorch_geometric\graphgym\main.py", line 27, in <module>
    load_cfg(cfg, args)
  File "C:\Users\morit\.conda\envs\test\lib\site-packages\torch_geometric\graphgym\config.py", line 503, in load_cfg
    cfg.merge_from_file(args.cfg_file)
  File "C:\Users\morit\.conda\envs\test\lib\site-packages\yacs\config.py", line 213, in merge_from_file
    self.merge_from_other_cfg(cfg)
  File "C:\Users\morit\.conda\envs\test\lib\site-packages\yacs\config.py", line 217, in merge_from_other_cfg
    _merge_a_into_b(cfg_other, self, self, [])
  File "C:\Users\morit\.conda\envs\test\lib\site-packages\yacs\config.py", line 491, in _merge_a_into_b
    raise KeyError("Non-existent config key: {}".format(full_key))
KeyError: 'Non-existent config key: example_arg'

and when I try to access it for example in a custom encoder:

import torch

from torch_geometric.graphgym.config import cfg
from torch_geometric.graphgym.register import register_node_encoder


@register_node_encoder('example')
class ExampleNodeEncoder(torch.nn.Module):
    def __init__(self, emb_dim, num_classes=None):
        super().__init__()
        
        # Some dummy code to throw the error
        self.example = cfg.example_arg

        self.encoder = torch.nn.Embedding(num_classes, emb_dim)
        torch.nn.init.xavier_uniform_(self.encoder.weight.data)

    def forward(self, batch):
        # Encode just the first dimension if more exist
        batch.x = self.encoder(batch.x[:, 0])

        return batch

I get a similar error to the one above.

I thought this is what the custom configs are for, or did I misunderstand something?

@rusty1s
Member

rusty1s commented Aug 19, 2022

I think you need to register the new attribute in cfg as well. For example:

@register_node_encoder('example')
class ExampleNodeEncoder(torch.nn.Module):
    pass

cfg.example_arg = default_value
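
With the attribute initialized like this, a line such as example_arg: some_value in the .yaml file should then merge without the KeyError, assuming the module that sets the default is imported before load_cfg(cfg, args) is called and the value has the same type as the default.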

@M-Lampert
Contributor Author

That works, thank you. But what is the use case for config/example.py then? I tested it, and everything also works when the attribute is set only where you suggested.

By the way: While testing this I found another bug and tried to fix it here: #5243

@rusty1s
Member

rusty1s commented Aug 20, 2022

Oh, you are right. You can also register a new config and initialize cfg parameters. I think both approaches work fine here. @JiaxuanYou can give more insights on which way is preferred.
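
For reference, the register_config route would look roughly like this (a minimal sketch modeled on the shipped example; the option names are placeholders):

from yacs.config import CfgNode as CN

from torch_geometric.graphgym.register import register_config


@register_config('my_example')
def set_cfg_my_example(cfg):
    # Default value for a flat custom option
    cfg.example_arg = 'example'
    # Defaults for a grouped custom option
    cfg.example_group = CN()
    cfg.example_group.example_arg = 'example'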

@do-lania

I just ran into the same issue. I am trying to create custom config args to specify in the yaml file, so that I can also use these custom configs in my other custom graphgym modules.

It does seem like these custom configs are supposed to be set in lines 448-450 in torch_geometric/graphgym/config.py in set_cfg():

# Set user customized cfgs
for func in register.config_dict.values():
    func(cfg)

However, it does not work as intended because register.config_dict is still empty the first time set_cfg() is run. This is because importing register_config first goes through torch_geometric/graphgym/__init__.py, which imports a number of other modules; these import config first and therefore initialize cfg before any user-defined configs have been registered.

I was able to fix this issue by running set_cfg(cfg) again in my main before running load_cfg(cfg, args).
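
In code, the workaround looks roughly like this in main.py (a sketch following the structure of graphgym/main.py; the custom modules have to be imported before set_cfg(cfg) is called so that register.config_dict is populated):

from torch_geometric.graphgym.cmd_args import parse_args
from torch_geometric.graphgym.config import cfg, load_cfg, set_cfg

import custom_graphgym  # noqa  (assumed to register the custom modules/configs)

args = parse_args()
# Re-run set_cfg() so that the user-defined config functions, which are now
# present in register.config_dict, are applied to cfg before merging the .yaml
set_cfg(cfg)
load_cfg(cfg, args)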

@rusty1s
Member

rusty1s commented Oct 15, 2022

@JiaxuanYou Can you take a look?

@do-lania

A bit more on this: it turns out that running set_cfg(cfg) before load_cfg(cfg, args) only solves part of the problem. It is still not possible to create custom modules that use custom configs, because these configs are not yet available when the modules are registered. For example, if I try to create a custom activation function:

from functools import partial

import torch.nn as nn

from torch_geometric.graphgym.config import cfg
from torch_geometric.graphgym.register import register_act


class CustomActivation(nn.Module):
    def __init__(self, custom_arg):
        super().__init__()
        self.custom_arg = custom_arg

    def forward(self, x):
        ...


register_act("custom_act",
             partial(CustomActivation, custom_arg=cfg.custom_act_arg))

This will not work, because cfg.custom_act_arg does not yet exist when the module is imported.

@M-Lampert
Contributor Author

I solved the issue by creating all my custom configs in the module's __init__.py. In your example, I would create it in torch_geometric/graphgym/custom_graphgym/act/__init__.py. Then it already exists when you try to use it in your example. When doing it like this, however, the torch_geometric/graphgym/custom_graphgym/config module has no use anymore; I guess it could be removed altogether before anyone else gets confused.
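
A rough sketch of what I mean, assuming a hypothetical custom_act module in that package and an option named custom_act_arg:

# torch_geometric/graphgym/custom_graphgym/act/__init__.py
from torch_geometric.graphgym.config import cfg

# Define the default here, so the attribute already exists when the modules
# below are imported and call register_act() at import time
cfg.custom_act_arg = 0.1

from .custom_act import *  # noqa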

@mdanb

mdanb commented May 30, 2023

Any updates on this? For me, what worked (partially) was to run set_cfg(cfg) as @do-lania suggested.

@FDUguchunhui


The current solution I have is to call the registered configuration function in main.py manually. It only needs to be done once for the custom configuration.

from yacs.config import CfgNode as CN

from torch_geometric.graphgym.register import register_config


@register_config('example')
def set_cfg(cfg):
    # ----------------------------------------------------------------------- #
    # Customized options
    # ----------------------------------------------------------------------- #
    cfg.run = CN()  # create the custom 'run' group
    cfg.run.repeat = 1
    cfg.run.name = None
    cfg.run.mark_down = False

In main.py:

args = parser.parse_args()
# load additional custom config
register.config_dict.get('example')(cfg)
# override cfg with some cmd line args from opts
load_cfg(cfg, args)
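
Note that this assumes the file containing the @register_config('example') function has already been imported at this point (e.g. via the import custom_graphgym line in graphgym/main.py); otherwise register.config_dict.get('example') returns None.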
