Commit 80c8533

Fix bugs
1 parent 8485087 commit 80c8533

3 files changed: +5 -5 lines changed


EXAMPLES.md

Lines changed: 2 additions & 2 deletions
@@ -30,7 +30,7 @@ The option `-c` controls where to load predefined configs, you can either specif
 For CRF models, you need to specify `--proj` to remove non-projective trees.
 Specifying `--mbr` to perform MBR decoding often leads to consistent improvement.
 
-The model finetuned on [`robert-large`](https://huggingface.co/roberta-large) achieves nearly state-of-the-art performance in English dependency parsing.
+The model trained by finetuning [`robert-large`](https://huggingface.co/roberta-large) achieves nearly state-of-the-art performance in English dependency parsing.
 Here we provide some recommended hyper-parameters (not the best, but good enough).
 You are allowed to set values of registered/unregistered parameters in bash to suppress default configs in the file.
 ```sh
@@ -46,7 +46,7 @@ $ python -u -m supar.cmds.biaffine_dep train -b -d 0 -c biaffine-dep-roberta-en
     --epochs=10 \
     --update-steps=4
 ```
-The pretrained multilingual model `biaffine-dep-xlmr` takes [`xlm-roberta-large`](https://huggingface.co/xlm-roberta-large) as backbone architecture and finetunes on it.
+The pretrained multilingual model `biaffine-dep-xlmr` takes [`xlm-roberta-large`](https://huggingface.co/xlm-roberta-large) as backbone architecture and finetunes it.
 The training command is as following:
 ```sh
 $ python -u -m supar.cmds.biaffine_dep train -b -d 0 -c biaffine-dep-xlmr -p model \
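As context for the hunk above: the `--proj` and `--mbr` flags mentioned in EXAMPLES.md belong to the CRF parsers rather than the biaffine one. A minimal sketch of such a run, assuming a `crf_dep` subcommand and a `crf-dep-en` config name that this commit does not itself contain:

```sh
# Hypothetical CRF training run; the subcommand and config name are assumed.
# Only --proj (drop non-projective trees) and --mbr (MBR decoding) come from
# the EXAMPLES.md text quoted in the diff above.
$ python -u -m supar.cmds.crf_dep train -b -d 0 -c crf-dep-en -p model \
    --proj \
    --mbr
```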

README.md

Lines changed: 2 additions & 2 deletions
@@ -31,7 +31,7 @@ $ python setup.py install
 ```
 
 As a prerequisite, the following requirements should be satisfied:
-* `python`: >= 3.7
+* `python`: >= 3.6
 * [`pytorch`](https://github.com/pytorch/pytorch): >= 1.7
 * [`transformers`](https://github.com/huggingface/transformers): >= 4.0
 
@@ -45,7 +45,7 @@ All results are tested on the machine with Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.2
 
 English and Chinese dependency parsing models are trained on PTB and CTB7 respectively.
 For each parser, we provide pretrained models that take BiLSTM as encoder.
-We also provide models finetuned on pretrained language models from [Huggingface Transformers](https://github.com/huggingface/transformers).
+We also provide models trained by finetuning pretrained language models from [Huggingface Transformers](https://github.com/huggingface/transformers).
 We use [`robert-large`](https://huggingface.co/roberta-large) for English and [`hfl/chinese-electra-180g-large-discriminator`](https://huggingface.co/hfl/chinese-electra-180g-large-discriminator) for Chinese.
 During evaluation, punctuation is ignored in all metrics for PTB.
 
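To make the reworded sentence concrete: a model obtained by finetuning one of these pretrained language models is used the same way as any other release. A hypothetical prediction run with the `biaffine-dep-roberta-en` name that appears in the EXAMPLES.md hunk; the `--data`/`--pred` flags and file paths here are assumptions, not part of this commit:

```sh
# Hypothetical prediction with the roberta-based English model;
# paths and flags are illustrative only.
$ python -u -m supar.cmds.biaffine_dep predict -d 0 -p biaffine-dep-roberta-en \
    --data ptb/test.conllx \
    --pred ptb/test.pred.conllx
```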

setup.py

Lines changed: 1 addition & 1 deletion
@@ -38,6 +38,6 @@
             'vi-sdp=supar.cmds.vi_sdp:main'
         ]
     },
-    python_requires='>=3.7',
+    python_requires='>=3.6',
     zip_safe=False
 )
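Since pip and setuptools enforce `python_requires` at install time, this one-line change is what actually allows installation on Python 3.6, matching the README edit above. A quick local check mirroring the relaxed constraint (the assert one-liner is illustrative, not from the repo):

```sh
# Mirrors the new python_requires='>=3.6' bound: exits nonzero on older interpreters.
$ python -c 'import sys; assert sys.version_info >= (3, 6), sys.version'
$ python setup.py install
```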
