Transformer-Based Biomedical Text Extraction
Al Khalaf, Ruba; Bernasconi, Anna
2024-01-01
Abstract
Unlike formal languages, natural languages are unstructured and more complex. Understanding and generating meaningful text are the goals of natural language processing (NLP). Deep learning has recently had a significant impact on this field: innovative and effective transformer-based models have achieved state-of-the-art results on a wide range of NLP tasks, including those involving specialized clinical and biomedical text. The most widely adopted models (BERT and GPT) are described here, along with their domain-specific versions and applications in the biomedical domain.
File | Description | Size | Format
---|---|---|---
Transformer_based_biomedical_text_extraction__chapter.pdf | Pre-Print (or Pre-Refereeing); restricted access | 251.38 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.