
Commit

[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Sep 19, 2023
1 parent 56d6726 commit decea41
Showing 6 changed files with 10 additions and 10 deletions.
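These are whitespace-only fixes of the kind produced by pre-commit's standard hygiene hooks (trailing-whitespace removal and final-newline normalization). As a sketch, a minimal `.pre-commit-config.yaml` enabling them could look like the following; the repository's actual config is not shown in this diff, so the `rev` and the commented `exclude` pattern are placeholders, not the project's real settings:

```yaml
# Hypothetical config; the repository's real .pre-commit-config.yaml is not part of this diff.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0  # placeholder revision
    hooks:
      - id: trailing-whitespace  # strips trailing spaces (the README/JSON changes below)
        # exclude: bert/.*/vocab\.txt  # vocab files may contain meaningful whitespace tokens
      - id: end-of-file-fixer    # ensures each file ends with exactly one newline
```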
2 changes: 1 addition & 1 deletion bert/bert-base-japanese-v3/vocab.txt
@@ -13,7 +13,7 @@
 [unused7]
 [unused8]
 [unused9]
-
+
 !
 "
 #
10 changes: 5 additions & 5 deletions bert/chinese-roberta-wwm-ext-large/README.md
@@ -1,5 +1,5 @@
 ---
-language: 
+language:
 - zh
 tags:
 - bert
@@ -9,9 +9,9 @@ license: "apache-2.0"
 # Please use 'Bert' related functions to load this model!
 
 ## Chinese BERT with Whole Word Masking
-For further accelerating Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**. 
+For further accelerating Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.
 
-**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)** 
+**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
 Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu
 
 This repository is developed based on:https://github.com/google-research/bert
@@ -46,12 +46,12 @@ If you find the technical report or resource is useful, please cite the following
 pages = "657--668",
 }
 ```
-- Secondary: https://arxiv.org/abs/1906.08101 
+- Secondary: https://arxiv.org/abs/1906.08101
 ```
 @article{chinese-bert-wwm,
 title={Pre-Training with Whole Word Masking for Chinese BERT},
 author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
 journal={arXiv preprint arXiv:1906.08101},
 year={2019}
 }
-```
+```
2 changes: 1 addition & 1 deletion bert/chinese-roberta-wwm-ext-large/added_tokens.json
@@ -1 +1 @@
-{}
+{}
2 changes: 1 addition & 1 deletion bert/chinese-roberta-wwm-ext-large/special_tokens_map.json
@@ -1 +1 @@
-{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
+{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
2 changes: 1 addition & 1 deletion bert/chinese-roberta-wwm-ext-large/tokenizer.json

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion bert/chinese-roberta-wwm-ext-large/tokenizer_config.json
@@ -1 +1 @@
-{"init_inputs": []}
+{"init_inputs": []}
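The two fixers at work in this commit can be emulated in a few lines. A minimal sketch, assuming only that the hooks strip trailing whitespace from each line and normalize the file to end with exactly one newline (this is not pre-commit's actual implementation):

```python
def fix_text(text: str) -> str:
    """Emulate the effect of the trailing-whitespace and end-of-file-fixer hooks."""
    # Strip trailing whitespace from every line (trailing-whitespace hook).
    lines = [line.rstrip() for line in text.split("\n")]
    fixed = "\n".join(lines)
    # Collapse any trailing blank lines and end with exactly one newline
    # (end-of-file-fixer hook).
    return fixed.rstrip("\n") + "\n"
```

For example, `fix_text("language: \n- zh")` returns `"language:\n- zh\n"`, which matches the README change above: same visible text, trailing space removed.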

0 comments on commit decea41
