Fix observations on PPO trainer (#340)
* Fix observations on PPO trainer

* tested and fixed the fix
awjuliani authored Feb 15, 2018
1 parent 39df903 commit ef61887
Showing 1 changed file with 1 addition and 1 deletion.
python/ppo/trainer.py (1 addition, 1 deletion)
@@ -122,7 +122,7 @@ def process_experiences(self, info, time_horizon, gamma, lambd):
        else:
            feed_dict = {self.model.batch_size: len(info.states)}
            if self.use_observations:
-               for i in range(self.info.observations):
+               for i in range(len(info.observations)):
                    feed_dict[self.model.observation_in[i]] = info.observations[i]
            if self.use_states:
                feed_dict[self.model.state_in] = info.states
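
For context, the one-line fix replaces a lookup of a non-existent trainer attribute (self.info) with the info argument that process_experiences() actually receives, and gives range() an integer count (len(info.observations)) rather than the observation list itself. The minimal sketch below, which is not the project's code, illustrates the difference; the FakeInfo container and the string keys are hypothetical stand-ins for the real experience object and the model's TensorFlow placeholders.

    # Minimal, self-contained sketch (not the ml-agents code) of the bug and the fix.
    class FakeInfo:
        """Stand-in for the per-step info object passed to process_experiences()."""
        def __init__(self, states, observations):
            self.states = states              # one state vector per agent
            self.observations = observations  # one batch of frames per camera

    info = FakeInfo(states=[[0.0, 1.0], [2.0, 3.0]],
                    observations=[["cam0_frame_a", "cam0_frame_b"]])

    feed_dict = {"batch_size": len(info.states)}

    # Buggy form: `self.info` does not exist, and range() cannot take a list.
    #     for i in range(self.info.observations): ...

    # Fixed form: index each visual observation batch by position.
    for i in range(len(info.observations)):
        feed_dict["observation_in[{}]".format(i)] = info.observations[i]

    print(feed_dict)
    # {'batch_size': 2, 'observation_in[0]': ['cam0_frame_a', 'cam0_frame_b']}

With the buggy form, the loop never runs correctly (an attribute or type error is raised before any entries are added); with the fix, the loop fills one feed_dict entry per visual observation batch.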

