Commit fc74c84: Swap TF and PT code inside two blocks (huggingface#14742)

1 parent 8362d07
1 file changed (+6 / -6 lines)

docs/source/quicktour.mdx

Lines changed: 6 additions & 6 deletions
````diff
@@ -334,21 +334,21 @@ PyTorch and TensorFlow: any model saved as before can be loaded back either in P
 If you would like to load your saved model in the other framework, first make sure it is installed:
 
 ```bash
-pip install tensorflow
-===PT-TF-SPLIT===
 pip install torch
+===PT-TF-SPLIT===
+pip install tensorflow
 ```
 
 Then, use the corresponding Auto class to load it like this:
 
 ```py
->>> from transformers import TFAutoModel
->>> tokenizer = AutoTokenizer.from_pretrained(pt_save_directory)
->>> tf_model = TFAutoModel.from_pretrained(pt_save_directory, from_pt=True)
-===PT-TF-SPLIT===
 >>> from transformers import AutoModel
 >>> tokenizer = AutoTokenizer.from_pretrained(tf_save_directory)
 >>> pt_model = AutoModel.from_pretrained(tf_save_directory, from_tf=True)
+===PT-TF-SPLIT===
+>>> from transformers import TFAutoModel
+>>> tokenizer = AutoTokenizer.from_pretrained(pt_save_directory)
+>>> tf_model = TFAutoModel.from_pretrained(pt_save_directory, from_pt=True)
 ```
 
 Lastly, you can also ask the model to return all hidden states and all attention weights if you need them:
````
