forked from huggingface/transformers
Add trajectory transformer (huggingface#17141)
* Add trajectory transformer
  * Fix model init
  * Fix end of lines for .mdx files
  * Add trajectory transformer model to toctree
  * Add forward input docs
  * Fix docs, remove prints, simplify prediction test
  * Apply suggestions from code review (Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>)
  * Apply suggestions from code review (Co-authored-by: Lysandre Debut <lysandre@huggingface.co>, Sylvain Gugger <35901082+sgugger@users.noreply.github.com>)
  * Update docs, more descriptive comments
  * Apply suggestions from code review (Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>)
  * Update readme
  * Small comment update and add conversion script
  * Rebase and reformat
  * Fix copies
  * Fix rebase, remove duplicates
* Remove tapex
1 parent c352640 · commit d6b8e9c
Showing 19 changed files with 1,297 additions and 1 deletion.
```diff
@@ -1,3 +1,4 @@
 *.py eol=lf
 *.rst eol=lf
 *.md eol=lf
+*.mdx eol=lf
```
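The `eol=lf` attribute added above makes Git normalize `.mdx` files to LF line endings in the working tree. A quick way to confirm how Git resolves the attribute is `git check-attr`; this is a throwaway sketch in a temporary repository, and the file name is illustrative, not from this commit:

```shell
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
# Reproduce the attribute line this commit adds
printf '*.mdx eol=lf\n' > .gitattributes
touch model_doc.mdx
# Ask Git which eol value applies to the file
git check-attr eol model_doc.mdx
# prints: model_doc.mdx: eol: lf
```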
```md
@@ -0,0 +1,49 @@
<!--Copyright 2022 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.
-->

# Trajectory Transformer

## Overview

The Trajectory Transformer model was proposed in [Offline Reinforcement Learning as One Big Sequence Modeling Problem](https://arxiv.org/abs/2106.02039) by Michael Janner, Qiyang Li, Sergey Levine.

The abstract from the paper is the following:

*Reinforcement learning (RL) is typically concerned with estimating stationary policies or single-step models,
leveraging the Markov property to factorize problems in time. However, we can also view RL as a generic sequence
modeling problem, with the goal being to produce a sequence of actions that leads to a sequence of high rewards.
Viewed in this way, it is tempting to consider whether high-capacity sequence prediction models that work well
in other domains, such as natural-language processing, can also provide effective solutions to the RL problem.
To this end, we explore how RL can be tackled with the tools of sequence modeling, using a Transformer architecture
to model distributions over trajectories and repurposing beam search as a planning algorithm. Framing RL as sequence
modeling problem simplifies a range of design decisions, allowing us to dispense with many of the components common
in offline RL algorithms. We demonstrate the flexibility of this approach across long-horizon dynamics prediction,
imitation learning, goal-conditioned RL, and offline RL. Further, we show that this approach can be combined with
existing model-free algorithms to yield a state-of-the-art planner in sparse-reward, long-horizon tasks.*

Tips:

This Transformer is used for deep reinforcement learning. To use it, you need to create sequences from
actions, states and rewards from all previous timesteps. This model will treat all these elements together
as one big sequence (a trajectory).

This model was contributed by [CarlCochet](https://huggingface.co/CarlCochet). The original code can be found [here](https://github.com/jannerm/trajectory-transformer).

## TrajectoryTransformerConfig

[[autodoc]] TrajectoryTransformerConfig

## TrajectoryTransformerModel

[[autodoc]] TrajectoryTransformerModel
    - forward
```
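The tips in the new doc file describe flattening each timestep's states, actions, and rewards into one token sequence. As a minimal, hypothetical sketch of that interleaving (this is not the repository's actual preprocessing code; the bin count, value ranges, and helper names are made-up assumptions for illustration), the idea could look like:

```python
def discretize(x, low, high, n_bins):
    """Map a continuous value to an integer bin in [0, n_bins)."""
    x = min(max(x, low), high)           # clamp into the modeled range
    bin_index = int((x - low) / (high - low) * n_bins)
    return min(bin_index, n_bins - 1)    # top edge maps to the last bin

def make_trajectory_tokens(states, actions, rewards, n_bins=100):
    """Interleave each timestep's (state, action, reward) into one flat sequence."""
    tokens = []
    for state, action, reward in zip(states, actions, rewards):
        tokens += [discretize(v, -1.0, 1.0, n_bins) for v in state]
        tokens += [discretize(v, -1.0, 1.0, n_bins) for v in action]
        tokens.append(discretize(reward, 0.0, 10.0, n_bins))
    return tokens

# Toy rollout: 3 timesteps, 2-dim states, 1-dim actions
states = [[0.1, -0.2], [0.3, 0.0], [-0.5, 0.9]]
actions = [[0.5], [-0.1], [1.0]]
rewards = [1.0, 0.0, 2.5]
tokens = make_trajectory_tokens(states, actions, rewards)
print(len(tokens))  # 12 tokens: 3 timesteps * (2 + 1 + 1)
```

The resulting flat integer sequence is the kind of "one big sequence (a trajectory)" the model consumes, with every state dimension, action dimension, and reward becoming its own token.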
```diff
@@ -116,6 +116,7 @@
     t5,
     tapas,
     tapex,
+    trajectory_transformer,
     transfo_xl,
     trocr,
     unispeech,
```