Commit ca0b82b

Fix doc examples: cannot import name (huggingface#14698)
* Fix doc examples: cannot import name

* remove copy because of some necessary minor changes (maybe add copy to the individual methods instead)

* Keep copy with some modifications

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
1 parent fc74c84 commit ca0b82b

File tree

1 file changed (+5 additions, −5 deletions)


src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py

Lines changed: 5 additions & 5 deletions
@@ -2117,7 +2117,7 @@ def forward(
             Indices of input sequence tokens in the vocabulary. Padding will be ignored by default should you
             provide it.
 
-            Indices can be obtained using :class:`~transformers.BigBirdPegasusTokenizer`. See
+            Indices can be obtained using :class:`~transformers.PegasusTokenizer`. See
             :meth:`transformers.PreTrainedTokenizer.encode` and :meth:`transformers.PreTrainedTokenizer.__call__`
             for details.
 
@@ -2862,7 +2862,7 @@ def forward(self, *args, **kwargs):
         return self.decoder(*args, **kwargs)
 
 
-# Copied from transformers.models.bart.modeling_bart.BartForCausalLM with Bart->BigBirdPegasus, 'facebook/bart-large'->"google/bigbird-pegasus-large-arxiv"
+# Copied from transformers.models.bart.modeling_bart.BartForCausalLM with BartDecoderWrapper->BigBirdPegasusDecoderWrapper, BartForCausalLM->BigBirdPegasusForCausalLM, BartPreTrainedModel->BigBirdPegasusPreTrainedModel, BartTokenizer->PegasusTokenizer, 'facebook/bart-large'->"google/bigbird-pegasus-large-arxiv"
 class BigBirdPegasusForCausalLM(BigBirdPegasusPreTrainedModel):
     def __init__(self, config):
         config = copy.deepcopy(config)
@@ -2917,7 +2917,7 @@ def forward(
                 Indices of input sequence tokens in the vocabulary. Padding will be ignored by default should you
                 provide it.
 
-                Indices can be obtained using :class:`~transformers.BigBirdPegasusTokenizer`. See
+                Indices can be obtained using :class:`~transformers.PegasusTokenizer`. See
                 :meth:`transformers.PreTrainedTokenizer.encode` and :meth:`transformers.PreTrainedTokenizer.__call__`
                 for details.
 
@@ -2985,9 +2985,9 @@ def forward(
 
         Example::
 
-            >>> from transformers import BigBirdPegasusTokenizer, BigBirdPegasusForCausalLM
+            >>> from transformers import PegasusTokenizer, BigBirdPegasusForCausalLM
 
-            >>> tokenizer = BigBirdPegasusTokenizer.from_pretrained("google/bigbird-pegasus-large-arxiv")
+            >>> tokenizer = PegasusTokenizer.from_pretrained("google/bigbird-pegasus-large-arxiv")
             >>> model = BigBirdPegasusForCausalLM.from_pretrained("google/bigbird-pegasus-large-arxiv", add_cross_attention=False)
             >>> assert model.config.is_decoder, f"{model.__class__} has to be configured as a decoder."
             >>> inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
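The underlying failure the commit fixes is an ImportError: `transformers` does not export a `BigBirdPegasusTokenizer` class, because BigBird-Pegasus reuses `PegasusTokenizer`. A minimal, hypothetical helper for catching this class of bug, checking that every name a doc example references is actually present in a module's namespace, might look like the sketch below (it is not part of this commit, and it is exercised against the standard library so it runs without `transformers` installed):

```python
import importlib


def missing_names(module_name, names):
    """Return the subset of `names` that `module_name` does not expose.

    A doc example importing any name in the returned list would raise
    ImportError, which is exactly the failure mode this commit fixes.
    """
    module = importlib.import_module(module_name)
    return [n for n in names if not hasattr(module, n)]


# `json` exposes `loads`, but has no `BigBirdPegasusTokenizer`-style name:
print(missing_names("json", ["loads", "BigBirdPegasusTokenizer"]))  # → ['BigBirdPegasusTokenizer']
```

Run against `transformers` itself (when installed), `missing_names("transformers", ["BigBirdPegasusTokenizer"])` would have flagged the old doc example before this fix.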
