This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Commit

Fix gp tuner (#1592)
* fix gp tuner
chicm-ms authored Oct 9, 2019
1 parent e93d2c2 commit 313b0f6
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion azure-pipelines.yml
@@ -58,7 +58,7 @@ jobs:
   - script: |
       python3 -m pip install torch==0.4.1 --user
       python3 -m pip install torchvision==0.2.1 --user
-      python3 -m pip install tensorflow --user
+      python3 -m pip install tensorflow==1.13.1 --user
     displayName: 'Install dependencies for integration'
   - script: |
       source install.sh
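The pipeline change above pins TensorFlow to a known-good release instead of taking whatever `pip` resolves at build time. A minimal sketch of the pinned install step, runnable outside Azure Pipelines (the version-check line is illustrative and not part of the original config):

```shell
# Pin TensorFlow so CI does not silently pick up a newer,
# potentially incompatible release.
python3 -m pip install "tensorflow==1.13.1" --user

# Optional sanity check that the pinned version was installed.
python3 -c "import tensorflow as tf; print(tf.__version__)"
```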
4 changes: 2 additions & 2 deletions src/sdk/pynni/nni/gp_tuner/gp_tuner.py
@@ -83,7 +83,7 @@ def update_search_space(self, search_space):
         """
         self._space = TargetSpace(search_space, self._random_state)

-    def generate_parameters(self, parameter_id):
+    def generate_parameters(self, parameter_id, **kwargs):
         """Generate next parameter for trial
         If the number of trial result is lower than cold start number,
         gp will first randomly generate some parameters.
@@ -123,7 +123,7 @@ def generate_parameters(self, parameter_id):
         logger.info("Generate paramageters:\n %s", results)
         return results

-    def receive_trial_result(self, parameter_id, parameters, value):
+    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
         """Tuner receive result from trial.

         Parameters
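The two signature changes above add `**kwargs` so the dispatcher can pass extra keyword arguments to a tuner without breaking implementations that do not use them. A minimal sketch of a tuner matching the updated signatures (the class name, hyperparameter, and keyword arguments below are illustrative assumptions, not taken from the commit):

```python
class SketchTuner:
    """Toy tuner illustrating the **kwargs-tolerant method signatures."""

    def __init__(self):
        self._count = 0
        self._results = {}

    def generate_parameters(self, parameter_id, **kwargs):
        # Return the next hyperparameter set for this trial;
        # unknown keyword arguments are simply absorbed.
        self._count += 1
        return {'learning_rate': 0.1 / self._count}

    def receive_trial_result(self, parameter_id, parameters, value, **kwargs):
        # Record the final metric reported by the trial.
        self._results[parameter_id] = value


tuner = SketchTuner()
# Extra keyword arguments (hypothetical here) no longer raise TypeError.
params = tuner.generate_parameters(0, trial_job_id='job0')
tuner.receive_trial_result(0, params, 0.95, customized=False)
```

Because the extra arguments land in `**kwargs`, callers can evolve the calling convention without every tuner subclass changing at once.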
