Concatenating Observation Space with Action Space in wrappers.py Causing Error Due to Different Dimensionality #468
PROMOTION01 asked this question in Q&A (unanswered)
I'm encountering an error while running train.py, and it seems to originate from wrappers.py. Here is the error message I'm receiving:
Upon inspecting the code, the issue appears to stem from concatenating the observation space and the action space, which have different dimensionality.
Could anyone provide an explanation or solution for this issue?
Code

```python
import gymnasium as gym
import numpy as np


class HistoryWrapper(gym.Wrapper[np.ndarray, np.ndarray, np.ndarray, np.ndarray]):
    """
    Stack past observations and actions to give a history to the agent.
    """
```
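
For reference, here is a minimal sketch of the kind of space concatenation the question describes. The shapes, the `horizon` value, and the `np.repeat`/`np.concatenate` construction are assumptions made for illustration (the snippet above does not show the wrapper's `__init__`), but they reproduce the same class of error: `np.concatenate` refuses to join arrays with a different number of dimensions, which is what happens when a multi-dimensional observation space meets a 1D action space.

```python
import numpy as np
from gymnasium import spaces

# Hypothetical spaces chosen to illustrate the mismatch: an image-like
# 2D observation space next to a flat 1D action space.
obs_space = spaces.Box(low=0.0, high=1.0, shape=(64, 64), dtype=np.float32)
action_space = spaces.Box(low=-1.0, high=1.0, shape=(4,), dtype=np.float32)

horizon = 2  # assumed history length

# History-wrapper-style construction: repeat the bounds along the last
# axis for each step of history, then join observation and action bounds.
low_obs = np.repeat(obs_space.low, horizon, axis=-1)        # shape (64, 128)
low_action = np.repeat(action_space.low, horizon, axis=-1)  # shape (8,)

# Raises ValueError ("all the input arrays must have same number of
# dimensions"): a 2D array cannot be concatenated with a 1D array.
low = np.concatenate((low_obs, low_action))
```

If both spaces are flat 1D `Box` spaces, the same concatenation succeeds, which suggests the wrapper expects a flattened observation space rather than, for example, image observations.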