maestro study not launching? #254

Closed · doutriaux1 opened this issue Apr 29, 2020 · 22 comments
Labels: bug (Description of reproducible unexpected behavior.)

@doutriaux1 (Collaborator)

Here is the yaml I'm trying to run. It expands correctly, and when I type "y" to launch the study everything looks correct. But it appears nothing is launched, and maestro status does not even show anything.

Launch output:

maestro run -p maestro_custom_generator.py maestro_bug.yaml
2020-04-29 08:09:26,637 - maestrowf.maestro:setup_logging:446 - INFO - INFO Logging Level -- Enabled
2020-04-29 08:09:26,637 - maestrowf.maestro:setup_logging:447 - WARNING - WARNING Logging Level -- Enabled
2020-04-29 08:09:26,637 - maestrowf.maestro:setup_logging:448 - CRITICAL - CRITICAL Logging Level -- Enabled
2020-04-29 08:09:26,642 - maestrowf.maestro:load_parameter_generator:105 - INFO - Loading custom parameter generator from '/usr/WS1/aml_cs/ALE/LAGER/data-generation/Hohlraum/maestro_custom_generator.py'
2020-04-29 08:09:28,055 - maestrowf.datastructures.core.study:__init__:199 - INFO - OUTPUT_PATH = /usr/WS1/aml_cs/ALE/LAGER/data-generation/Hohlraum/generate_hohlraum_20200429-080926
2020-04-29 08:09:28,055 - maestrowf.datastructures.core.study:add_step:339 - INFO - Adding step 'kosh' to study 'generate_hohlraum'...
2020-04-29 08:09:28,055 - maestrowf.datastructures.core.study:add_step:339 - INFO - Adding step 'directory_permissions' to study 'generate_hohlraum'...
2020-04-29 08:09:28,056 - maestrowf.datastructures.core.study:setup_workspace:379 - INFO - Setting up study workspace in '/usr/WS1/aml_cs/ALE/LAGER/data-generation/Hohlraum/generate_hohlraum_20200429-080926'
2020-04-29 08:09:28,056 - maestrowf.datastructures.core.study:setup_environment:389 - INFO - Environment is setting up.
2020-04-29 08:09:28,056 - maestrowf.datastructures.core.studyenvironment:acquire_environment:191 - INFO - Acquiring dependencies
2020-04-29 08:09:28,056 - maestrowf.datastructures.core.study:configure_study:417 - INFO -
------------------------------------------
Output path =               /usr/WS1/aml_cs/ALE/LAGER/data-generation/Hohlraum/generate_hohlraum_20200429-080926
Submission attempts =       1
Submission restart limit =  1
Submission throttle limit = 0
Use temporary directory =   False
Hash workspaces =           False
------------------------------------------
2020-04-29 08:09:28,056 - maestrowf.datastructures.core.executiongraph:__init__:346 - INFO -
------------------------------------------
Submission attempts =       1
Submission throttle limit = 0
Use temporary directory =   False
Tmp Dir =
------------------------------------------
2020-04-29 08:09:28,056 - maestrowf.datastructures.core.executiongraph:log_description:518 - INFO -
==================================================
name: generate_hohlraum
description: Runs Hohlraum Simulation
==================================================

2020-04-29 08:09:28,056 - maestrowf.datastructures.core.study:_stage:440 - INFO -
==================================================
Constructing parameter study 'generate_hohlraum'
==================================================

2020-04-29 08:09:28,056 - maestrowf.datastructures.core.study:_stage:466 - INFO -
[snip]
2020-04-29 08:09:28,070 - maestrowf.utils:create_parentdir:101 - INFO - Directory does not exist. Creating directories to /usr/WS1/aml_cs/ALE/LAGER/data-generation/Hohlraum/generate_hohlraum_20200429-080926/meta
2020-04-29 08:09:28,073 - maestrowf.utils:create_parentdir:101 - INFO - Directory does not exist. Creating directories to /usr/WS1/aml_cs/ALE/LAGER/data-generation/Hohlraum/generate_hohlraum_20200429-080926/meta/study
Would you like to launch the study? [yn] y
Study launched successfully.

status (nothing):

(kosh) [cdoutrix@rztopaz188:Hohlraum]$ maestro status generate_hohlraum_20200429-080926
(kosh) [cdoutrix@rztopaz188:Hohlraum]$

the yaml:

description:
  name: generate_hohlraum
  description: Runs Hohlraum Simulation

env:
  labels:
      RUN_PATH: /usr/workspace/aml_cs/ALE/Hohlraum
      SCRIPTS_DIR: /usr/workspace/aml_cs/ALE/LAGER/data-generation/Hohlraum
      MASH_CONFIG_FILE: mash_kull_config.ascii
      ALEEVERY: 1
      MOVIE_VARIABLES: Material Pressure Temperature
      PROCESSORS_PER_NODE: 36
      PROCESSORS_PER_DOMAIN: 4
batch:
  type:  slurm
  queue: pbatch
  host:  rztopaz
  bank:  wbronze

generator.parameters:
  RHO_FOAM2:
    values: [ 2.e-6, 0.350, ]
    label: foamDensity_%%
  LASPOWERMULT:
    values: [ 1.0, ]
    label: laserPowerMult_%%
  RESOLUTION:
    values: [ .15, .25, .5, ]
    label: meshResolution_%%
  DOMAIN:
    values: [ 18, 72, 288, ]
    label: domain_%%
  MINIMALALE:
    values: [ 0, 1 ]
    label: minimalAle_%%
  LINKED:
    RESOLUTION: [ "DOMAIN",]

study:
      - name: kosh
        description: add simulation to kosh
        run:
          cmd: |
            echo $(PROC).$(NODES).$(DOMAIN).$(RESOLUTION).$(RHO_FOAM2).$(LASPOWERMULT).$(MINIMALALE).$(PROCS_XENA).$(NODES_XENA)
            export RUNDIR=HWH_$( basename $(WORKSPACE))
            python $(SCRIPTS_DIR)/add_to_kosh.py --store=/usr/workspace/aml_cs/kosh/kosh_store.sql --root $(RUN_PATH) -n $RUNDIR
          depends: []
      
      - name: directory_permissions
        description: fix directory permissions
        run:
          cmd: |
            echo $(PROC).$(NODES).$(DOMAIN).$(RESOLUTION).$(RHO_FOAM2).$(LASPOWERMULT).$(MINIMALALE).$(PROCS_XENA).$(NODES_XENA)
            export RUNDIR=HWH_$( basename $(WORKSPACE))
            find $(RUN_PATH)/$RUNDIR  -type f -exec chmod g+r  {} +
            find $(RUN_PATH)/$RUNDIR  -type d -exec chmod g+x  {} +
            chgrp -R aml_cs $(run_PATH)/$RUNDIR
          depends: []

custom generator:

import sys
from maestrowf.datastructures.core import ParameterGenerator
from sklearn.model_selection import ParameterGrid
import yaml
try:
  from yaml import CLoader as Loader, CDumper as Dumper
except ImportError:
  from yaml import Loader, Dumper


def compute_nodes(number_procs, proc_per_node):
    if number_procs % proc_per_node == 0:
      add_node = 0
    else:
      add_node = 1
    return number_procs // proc_per_node + add_node

def compute_xena_nodes_and_procs(number_domains, proc_per_node):
    # Hard code to 4 domains per processor and use every processor on the node
    domains_per_proc = 4
    domains_per_node = proc_per_node*domains_per_proc
    add_node = 0
    if number_domains%domains_per_node != 0 :
      add_node = 1
    nodes = number_domains // domains_per_node + add_node
    # Use every processor on the nodes, up to the number of domains
    procs = min( number_domains, proc_per_node*nodes )
    return ( nodes, procs )
  

def get_custom_generator(env, **kwargs):
  """
Create a custom populated ParameterGenerator.

This function recreates the exact same parameter set as the sample LULESH
specifications. The point of this file is to present an example of how to
generate custom parameters.

:returns: A ParameterGenerator populated with parameters.
"""
  import sys
  p_gen = ParameterGenerator()
  yml = yaml.load(open(sys.argv[-1]).read(), Loader=Loader)
  p_in = {}
  labels = {}

  linked_ps = {}
  linked = {}

  for k, val in yml["generator.parameters"].items():
    if "values" in val:
        if isinstance(val["values"], (list,tuple)):
            p_in[k] = list(val["values"])
        else:
            p_in[k] = [val["values"],]
        labels[k] = val["label"]
    elif k == "LINKED":
        linked = val

  for plink in linked:
    for link in linked[plink]:
        if not plink in linked_ps:
            linked_ps[plink] = {}
        linked_ps[plink][link] = p_in.pop(link)

  grid = ParameterGrid(p_in)
  p = {}
  for g in grid:
    for k in g:
        if k not in p:
            p[k] = [g[k], ]
            if k in linked_ps:
                for link in linked_ps[k]:
                    p[link] = [linked_ps[k][link][p_in[k].index(g[k])],]
        else:
            p[k].append(g[k])
            if k in linked_ps:
                for link in linked_ps[k]:
                    p[link].append(linked_ps[k][link][p_in[k].index(g[k])])

  # now we have some magic to do:
  # first use global env to figure number of procs per node
  proc_per_node = yml['env']['labels']['PROCESSORS_PER_NODE']
  proc_per_domain = yml['env']['labels']['PROCESSORS_PER_DOMAIN']

  p["PROC"] = []
  p["NODES"] = []
  p['PROCS_XENA'] = []
  p['NODES_XENA'] = []

  for i, d in enumerate(p["DOMAIN"]):
      p["PROC"].append(d*proc_per_domain)
      p["NODES"].append(compute_nodes(p["PROC"][-1], proc_per_node))
      # Xena can create sub domains for every processor.  Create 4 domains
      # per processor to reduce the node count needed.
      xena_nodes_and_procs = compute_xena_nodes_and_procs(d,proc_per_node)
      p["NODES_XENA"].append(xena_nodes_and_procs[0])
      p["PROCS_XENA"].append(xena_nodes_and_procs[1])

  labels["PROC"] = "PROC_%%"
  labels["NODES"] = "NODES_%%"
  labels["NODES_XENA"] = "NODES_XENA_%%"
  labels["PROCS_XENA"] = "PROCS_XENA_%%"
  for k, val in p.items():
    p_gen.add_parameter(k, val, labels[k])
  return p_gen
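
For readers following the custom generator above, here is a minimal standalone sketch (with a reduced, hypothetical parameter set) of what the LINKED block does: the linked key (DOMAIN) is pulled out of the grid and re-attached by index, so it tracks its driver (RESOLUTION) instead of being crossed with it.

from sklearn.model_selection import ParameterGrid

# Reduced example: RESOLUTION drives DOMAIN; MINIMALALE is still crossed normally.
p_in = {"RESOLUTION": [0.15, 0.25, 0.5], "MINIMALALE": [0, 1]}
linked = {"RESOLUTION": {"DOMAIN": [18, 72, 288]}}

p = {}
for combo in ParameterGrid(p_in):
    for key, value in combo.items():
        p.setdefault(key, []).append(value)
        if key in linked:
            for link, values in linked[key].items():
                # Re-attach the linked value at the same index as its driver value.
                p.setdefault(link, []).append(values[p_in[key].index(value)])

print(p)  # DOMAIN stays paired with RESOLUTION: 0.15 -> 18, 0.25 -> 72, 0.5 -> 288
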
FrankD412 self-assigned this Apr 29, 2020
FrankD412 added the bug label Apr 29, 2020
@FrankD412 (Member)

@doutriaux1 -- What version of Maestro are you using?

@doutriaux1 (Collaborator Author)

maestro 1.1.7dev1

@doutriaux1 (Collaborator Author)

@FrankD412 I have the same version of the yaml, but with more jobs to be run before these steps, and that one works great.

One difference is that I do not have any slurm jobs anymore in this one.

C.

@FrankD412 (Member) commented Apr 29, 2020

If you pulled recently, there was a shift in where expansion happens. Expansion no longer happens before Maestro exits, instead happening in the conductor. How many nodes does your graph have?

Also, what does the Maestro generated directory look like?

@doutriaux1 (Collaborator Author)

ll -rt generate_hohlraum_20200429-080926
total 60K
drwx--S---  2 cdoutrix aml_cs 4.0K Apr 29 08:09 logs
-rw-------  1 cdoutrix aml_cs 3.4K Apr 29 08:09 maestro_custom_generator.py
drwx--S---  3 cdoutrix aml_cs 4.0K Apr 29 08:09 meta
-rw-------  1 cdoutrix aml_cs 1.8K Apr 29 08:09 maestro_bug.yaml
-rw-------  1 cdoutrix aml_cs  25K Apr 29 08:09 generate_hohlraum.pkl
-rw-------  1 cdoutrix aml_cs   22 Apr 29 08:09 generate_hohlraum.txt
drwx--S--- 14 cdoutrix aml_cs 8.0K Apr 29 08:10 directory_permissions
drwx--S---  3 cdoutrix aml_cs 4.0K Apr 29 08:10 kosh

@doutriaux1 (Collaborator Author)

My git log says:

git log -n1
commit 7022c4370cae8070e4632a423b78298782f3cabb
Author: Francesco Di Natale <dinatale3@llnl.gov>
Date:   Tue Apr 14 11:28:38 2020 -0700

    Bugfix for logging that didn't appear in submodules (#247)

    * Improved logging setup.

    * Transition to a LoggerUtil class.

    * Addition of docstring to LoggerUtility + cleanup.

@FrankD412 (Member) commented Apr 29, 2020

Ok, you're using a more recent version, which means my previous comments apply. I did notice that you're missing some keys if you intend to schedule steps. Based on your parameter generator, I think this is what you're after:

- name: kosh
  description: add simulation to kosh
  run:
    cmd: |
      echo $(PROC).$(NODES).$(DOMAIN).$(RESOLUTION).$(RHO_FOAM2).$(LASPOWERMULT).$(MINIMALALE).$(PROCS_XENA).$(NODES_XENA)
      export RUNDIR=HWH_$( basename $(WORKSPACE))
      python $(SCRIPTS_DIR)/add_to_kosh.py --store=/usr/workspace/aml_cs/kosh/kosh_store.sql --root $(RUN_PATH) -n $RUNDIR
    nodes: $(NODES)
    procs: $(PROC)
    walltime: <walltime for this step>
    depends: []

Are there any exceptions at the end of the log file in logs?

@doutriaux1 (Collaborator Author)

@FrankD412 those keys should be generated by the custom generator, but you're right, something is wrong; the log says:

ed?: False
2020-04-29 08:10:24,970 - maestrowf.interfaces.script.localscriptadapter:submit:153 - WARNING - Execution returned an error: /usr/WS1/aml_cs/ALE/LAGER/data-generation/Hohlraum/generate_hohlraum_20200429-080926/directory_permissions/domain_288.laserPowerMult_1.0.minimalAle_1.NODES_32.NODES_XENA_2.PROC_1152.PROCS_XENA_72.meshResolution_0.5.foamDensity_2e-06/directory_permissions_domain_288.laserPowerMult_1.0.minimalAle_1.
NODES_32.NODES_XENA_2.PROC_1152.PROCS_XENA_72.meshResolution_0.5.foamDensity_2e-06.slurm.sh: line 7: run_PATH: command not found
chgrp: cannot access '/HWH_domain_288.laserPowerMult_1.0.minimalAle_1.NODES_32.NODES_XENA_2.PROC_1152.PROCS_XENA_72.meshResolution_0.5.foamDensity_2e-06': No such file or directory

@doutriaux1 (Collaborator Author)

Actually the step directories are all generated, but only one kosh directory is generated:

ll -rt generate_hohlraum_20200429-080926/kosh
total 4.0K
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_18.laserPowerMult_1.0.minimalAle_0.NODES_2.NODES_XENA_1.PROC_72.PROCS_XENA_18.meshResolution_0.15.foamDensity_2e-06
(kosh) [cdoutrix@rztopaz188:Hohlraum]$ ll -rt generate_hohlraum_20200429-080926/directory_permissions
total 48K
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:09 domain_18.laserPowerMult_1.0.minimalAle_0.NODES_2.NODES_XENA_1.PROC_72.PROCS_XENA_18.meshResolution_0.15.foamDensity_2e-06
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:09 domain_18.laserPowerMult_1.0.minimalAle_0.NODES_2.NODES_XENA_1.PROC_72.PROCS_XENA_18.meshResolution_0.15.foamDensity_0.35
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:09 domain_72.laserPowerMult_1.0.minimalAle_0.NODES_8.NODES_XENA_1.PROC_288.PROCS_XENA_36.meshResolution_0.25.foamDensity_2e-06
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:09 domain_72.laserPowerMult_1.0.minimalAle_0.NODES_8.NODES_XENA_1.PROC_288.PROCS_XENA_36.meshResolution_0.25.foamDensity_0.35
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:09 domain_288.laserPowerMult_1.0.minimalAle_0.NODES_32.NODES_XENA_2.PROC_1152.PROCS_XENA_72.meshResolution_0.5.foamDensity_2e-06
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_288.laserPowerMult_1.0.minimalAle_0.NODES_32.NODES_XENA_2.PROC_1152.PROCS_XENA_72.meshResolution_0.5.foamDensity_0.35
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_18.laserPowerMult_1.0.minimalAle_1.NODES_2.NODES_XENA_1.PROC_72.PROCS_XENA_18.meshResolution_0.15.foamDensity_2e-06
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_18.laserPowerMult_1.0.minimalAle_1.NODES_2.NODES_XENA_1.PROC_72.PROCS_XENA_18.meshResolution_0.15.foamDensity_0.35
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_72.laserPowerMult_1.0.minimalAle_1.NODES_8.NODES_XENA_1.PROC_288.PROCS_XENA_36.meshResolution_0.25.foamDensity_2e-06
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_72.laserPowerMult_1.0.minimalAle_1.NODES_8.NODES_XENA_1.PROC_288.PROCS_XENA_36.meshResolution_0.25.foamDensity_0.35
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_288.laserPowerMult_1.0.minimalAle_1.NODES_32.NODES_XENA_2.PROC_1152.PROCS_XENA_72.meshResolution_0.5.foamDensity_2e-06
drwx--S--- 2 cdoutrix aml_cs 4.0K Apr 29 08:10 domain_288.laserPowerMult_1.0.minimalAle_1.NODES_32.NODES_XENA_2.PROC_1152.PROCS_XENA_72.meshResolution_0.5.foamDensity_0.35

@doutriaux1 (Collaborator Author)

Forget the error in the log, I have a typo: $(run_PATH) instead of $(RUN_PATH). But that doesn't explain why only one kosh directory is generated. I'll keep looking in the log.

@FrankD412 (Member)

@doutriaux1 -- You defined the variable as RUN_PATH -- it seems that you're using it as run_PATH.

In your directory_permissions step change chgrp -R aml_cs $(run_PATH)/$RUNDIR to chgrp -R aml_cs $(RUN_PATH)/$RUNDIR
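
As an aside, those $(...) tokens are case-sensitive, and an unexpanded token such as $(run_PATH) is left for the shell to treat as command substitution, which is where the "run_PATH: command not found" line in the log above comes from. A rough sketch of a sanity check for this kind of mismatch, assuming the spec layout shown earlier in this issue:

import re
import yaml

spec = yaml.safe_load(open("maestro_bug.yaml"))
labels = spec.get("env", {}).get("labels", {}) or {}

for step in spec.get("study", []):
    for token in re.findall(r"\$\(([^)]+)\)", step["run"]["cmd"]):
        # Flag tokens whose upper-cased form matches a defined label, e.g. run_PATH vs RUN_PATH.
        if token not in labels and token.upper() in labels:
            print(f"step {step['name']}: $({token}) should probably be $({token.upper()})")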

@doutriaux1 (Collaborator Author)

the log seems to indicate it's expanding ok:

==================================================
Expanding step 'kosh'
==================================================
-------- Used Parameters --------
{'LASPOWERMULT', 'NODES', 'RESOLUTION', 'DOMAIN', 'NODES_XENA', 'PROCS_XENA', 'PROC', 'RHO_FOAM2', 'MINIMALALE'}
---------------------------------
2020-04-29 08:09:28,064 - maestrowf.datastructures.core.study:_stage:616 - INFO -
**********************************
Combo [laserPowerMult_1.0.minimalAle_0.meshResolution_0.15.domain_18.foamDensity_2e-06.PROC_72.NODES_2.PROCS_XENA_18.NODES_XENA_1]
**********************************
2020-04-29 08:09:28,064 - maestrowf.datastructures.core.study:_stage:645 - INFO - Searching for workspaces...
cmd = echo 72.2.18.0.15.2e-06.1.0.0.18.1
export RUNDIR=HWH_$( basename $(WORKSPACE))
python /usr/workspace/aml_cs/ALE/LAGER/data-generation/Hohlraum/add_to_kosh.py --store=/usr/workspace/aml_cs/kosh/kosh_store.sql --root /usr/workspace/aml_cs/ALE/Hohlraum -n $RUNDIR

2020-04-29 08:09:28,064 - maestrowf.datastructures.core.study:_stage:676 - INFO - New cmd = echo 72.2.18.0.15.2e-06.1.0.0.18.1
export RUNDIR=HWH_$( basename $(WORKSPACE))
python /usr/workspace/aml_cs/ALE/LAGER/data-generation/Hohlraum/add_to_kosh.py --store=/usr/workspace/aml_cs/kosh/kosh_store.sql --root /usr/workspace/aml_cs/ALE/Hohlraum -n $RUNDIR

2020-04-29 08:09:28,064 - maestrowf.datastructures.core.study:_stage:616 - INFO -
**********************************
Combo [laserPowerMult_1.0.minimalAle_0.meshResolution_0.15.domain_18.foamDensity_0.35.PROC_72.NODES_2.PROCS_XENA_18.NODES_XENA_1]
**********************************
2020-04-29 08:09:28,064 - maestrowf.datastructures.core.study:_stage:645 - INFO - Searching for workspaces...
cmd = echo 72.2.18.0.15.0.35.1.0.0.18.1
export RUNDIR=HWH_$( basename $(WORKSPACE))
python /usr/workspace/aml_cs/ALE/LAGER/data-generation/Hohlraum/add_to_kosh.py --store=/usr/workspace/aml_cs/kosh/kosh_store.sql --root /usr/workspace

@doutriaux1 (Collaborator Author)

@FrankD412 changing to RUN_PATH does not seem to make a difference

@doutriaux1 (Collaborator Author)

@FrankD412 even if it fails, the status should at least indicate what has run/initialized/failed, etc. No?

@FrankD412 (Member)

@doutriaux1 -- It should. I'll have to sit down with the sample or schedule a meeting with you to dive deeper. There isn't anything blatantly wrong that I'm seeing here; let me mess with it on my end and I'll see what I can find.

@doutriaux1 (Collaborator Author)

Ok thanks. It's really odd since the full one (with slurm jobs) works fine.

@doutriaux1 (Collaborator Author)

@FrankD412 if that "helps", things get worse with the repo's head:

 maestro run -p maestro_custom_generator.py maestro_bug.yaml
[2020-04-29 08:46:30: INFO] INFO Logging Level -- Enabled
[2020-04-29 08:46:30: WARNING] WARNING Logging Level -- Enabled
[2020-04-29 08:46:30: CRITICAL] CRITICAL Logging Level -- Enabled
[2020-04-29 08:46:30: INFO] Loading specification -- path = maestro_bug.yaml
[2020-04-29 08:46:30: ERROR] ('variables',)
Traceback (most recent call last):
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 112, in load_specification
    specification = cls.load_specification_from_stream(data)
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 155, in load_specification_from_stream
    specification.verify()
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 162, in verify
    self.verify_environment()
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 280, in verify_environment
    keys_seen = self._verify_variables()
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 200, in _verify_variables
    for key, value in self.environment["variables"].items():
KeyError: 'variables'
Traceback (most recent call last):
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/bin/maestro", line 11, in <module>
    load_entry_point('maestrowf==1.1.7.dev1', 'console_scripts', 'maestro')()
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/maestro.py", line 424, in main
    rc = args.func(args)
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/maestro.py", line 130, in run_study
    spec = YAMLSpecification.load_specification(args.specification)
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 116, in load_specification
    raise e
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 112, in load_specification
    specification = cls.load_specification_from_stream(data)
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 155, in load_specification_from_stream
    specification.verify()
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 162, in verify
    self.verify_environment()
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 280, in verify_environment
    keys_seen = self._verify_variables()
  File "/g/g19/cdoutrix/miniconda3/envs/kosh/lib/python3.8/site-packages/maestrowf-1.1.7.dev1-py3.8.egg/maestrowf/datastructures/yamlspecification.py", line 200, in _verify_variables
    for key, value in self.environment["variables"].items():
KeyError: 'variables'
(kosh) [cdoutrix@rztopaz188:Hohlraum]$

@FrankD412 (Member)

Oh, that looks like it may be a bug with the last release. Can you file that in a separate issue? That's a new feature that was added about a week ago.

I also have a related question out of curiosity: it looks like you're putting statically defined items in labels -- those should probably go in variables. Is there a reason why you prefer the labels section?

@doutriaux1 (Collaborator Author)

@FrankD412 not really, I probably just copy/pasted from another example. And maybe because these do not "vary".

@FrankD412 (Member)

Got it -- was just curious if there was a use case for it that I should be supporting. Thanks for the info.

@FrankD412 (Member)

[snip -- quoted the KeyError: 'variables' traceback from the comment above]

I just created the new issue; I didn't realize I could create it straight off the comment. Just an FYI so there's no need to make a new one.

@FrankD412 (Member)

@doutriaux1 -- @ben-bay just fixed the variable section bug. I've got the example expanding on my own machine. Will be looking at this shortly.
