3. Download the pre-trained BioBERT models from this [link](https://github.com/dmis-lab/biobert). We use *BioBERT-Large* in our experiments; place the extracted model in the `./biobert_large` folder (the path passed to `--bert_path` below).
4. Open a shell or cmd in this repo folder and run the following command to install the necessary packages.
```cmd
pip install -r requirements.txt
```
## Experiments
1. For Linux systems, we provide shell scripts that run the training procedure. Run one of the following commands:
```cmd
./train.ncbi.sh
./train.bc5cdr.sh
```
2. You can also train the model directly with the command below. Square brackets indicate the available choices for a parameter; the meaning of each parameter is given in the following table, and a sketch of the corresponding argument parsing follows the command.
| Parameter | Type | Description |
| ---- | ---- | ---- |
| seed | int | Random seed |
| epoch | int | Number of training epochs |
| LAMBDA | float | Hyper-parameter of the loss function |
| MU | float | Hyper-parameter of the loss function |
| dataset | str | Dataset to train on: `ncbi` or `cdr` |
| bert_path | str | Folder path of the pre-trained BERT model |
| save_pred_result | bool | Whether to save the prediction results |
```cmd
python main.py \
--seed 11 \
--epoch 12 \
--LAMBDA 0.125 \
--MU 0.1 \
--dataset [ncbi, cdr] \
--bert_path ./biobert_large \
--save_pred_result
```
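A minimal sketch of how `main.py` might parse these flags with `argparse`; the flag names and example values are taken from the command above, but the actual parsing code in this repo may differ.
```python
# Hypothetical sketch of the argument parsing behind main.py; flag names and
# example defaults mirror the command above, not the repo's actual code.
import argparse

def build_arg_parser():
    parser = argparse.ArgumentParser(description="Train the model")
    parser.add_argument("--seed", type=int, default=11, help="Random seed")
    parser.add_argument("--epoch", type=int, default=12,
                        help="Number of training epochs")
    parser.add_argument("--LAMBDA", type=float, default=0.125,
                        help="Hyper-parameter of the loss function")
    parser.add_argument("--MU", type=float, default=0.1,
                        help="Hyper-parameter of the loss function")
    parser.add_argument("--dataset", choices=["ncbi", "cdr"], required=True,
                        help="Dataset to train on")
    parser.add_argument("--bert_path", type=str, default="./biobert_large",
                        help="Folder path of the pre-trained BERT model")
    parser.add_argument("--save_pred_result", action="store_true",
                        help="Save the prediction results")
    return parser

if __name__ == "__main__":
    args = build_arg_parser().parse_args()
    print(args)
```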
3. After training, the test results are saved in the "results" folder and the model weights in the "weights" folder (a sketch of reloading a saved checkpoint follows).
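If you want to reload a saved checkpoint afterwards, the sketch below assumes the weights were saved with PyTorch and uses a hypothetical file name; adapt both to the actual files written to the "weights" folder.
```python
# Illustrative only: inspect and restore a saved checkpoint, assuming PyTorch
# was used to save it; "weights/ncbi_model.pt" is a hypothetical file name.
import torch

state_dict = torch.load("weights/ncbi_model.pt", map_location="cpu")
print(f"Checkpoint contains {len(state_dict)} tensors")
# Restoring into a model instance defined in this repo would then be:
#     model.load_state_dict(state_dict)
#     model.eval()
```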
4. We also provide the trained model weights for reproducing the results in our paper. Download the [weights file](https://pan.baidu.com/s/15DLSb2fvgbOiiv0V0ADFNg) (extraction code: **1234**), put it into the "weights" folder, and run the following command: