commit 4ddc773 (2 parents: f26d938 + 9811e94)
README.md
@@ -33,10 +33,10 @@ Our implementation is based on Huggingface Transformers. You can use the followi
 
 ```python
 from transformers import AutoTokenizer
-from modeling_ttt import TTTForCausalLM, TTTConfig, TTT_STANDARD_CONFIGS
+from ttt import TTTForCausalLM, TTTConfig, TTT_STANDARD_CONFIGS
 
 # Initializing a TTT ttt-1b style configuration
-# configuration = TTTConfig(**TTT_STANDARD_CONFIGS['ttt-1b']) is equivalent to the following
+# configuration = TTTConfig(**TTT_STANDARD_CONFIGS['1b']) is equivalent to the following
 configuration = TTTConfig()
 
 # Initializing a model from the ttt-1b style configuration
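For context, a minimal sketch of how the README snippet reads after this change. Only the lines inside the hunk above come from the commit; the final `model = TTTForCausalLM(configuration)` call is an assumed continuation based on the standard Transformers-style API implied by the surrounding comments.

```python
from transformers import AutoTokenizer  # used later in the README for tokenization
from ttt import TTTForCausalLM, TTTConfig, TTT_STANDARD_CONFIGS

# Initializing a TTT ttt-1b style configuration
# configuration = TTTConfig(**TTT_STANDARD_CONFIGS['1b']) is equivalent to the following
configuration = TTTConfig()

# Initializing a model from the ttt-1b style configuration
# (assumed continuation; this line is outside the hunk shown in the diff)
model = TTTForCausalLM(configuration)
```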