Pretrained BERT Model
PyTrial provides an easy-to-use interface for loading and using pretrained BERT models.
from pytrial.model_utils.bert import BERT
# Load pretrained BERT model
model = BERT()
# encode
emb = model.encode('The goal of life is comfort.')
The default pretrained BERT is emilyalsentzer/Bio_ClinicalBERT. You can switch to another pretrained BERT model by specifying its name:
# specify model name
model = BERT(bertname='bert-base-uncased')
The passed bertname should be the identifier of a pretrained BERT model listed on the HuggingFace Model Hub.
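Since encode returns a dense embedding vector, a common next step is to compare two texts by cosine similarity. The sketch below assumes the embedding is a NumPy-compatible 1-D vector; the two example arrays are placeholders standing in for real model.encode outputs, not actual model output.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholder vectors standing in for model.encode(...) outputs.
emb1 = np.array([0.2, 0.7, 0.1])
emb2 = np.array([0.25, 0.65, 0.05])
sim = cosine_similarity(emb1, emb2)
```

A similarity close to 1.0 indicates the two texts are encoded to nearby points in the embedding space.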