Transformer-Based Biomedical Text Extraction

Al Khalaf, Ruba;Bernasconi, Anna
2024-01-01

Abstract

Unlike formal languages, natural languages are unstructured and far more complex. Understanding and generating meaningful text are the goals of natural language processing (NLP). Deep learning has recently had a significant impact on this field: innovative and effective transformer-based models have achieved state-of-the-art results on a wide range of NLP tasks, including those involving specialized clinical and biomedical text. The most widely adopted models, BERT and GPT, are described here along with their domain-specific versions and applications in the biomedical domain.
Published in: Encyclopedia of Bioinformatics and Computational Biology, 2nd Edition (2024)
ISBN: 9780128096338
Files in this product:
Transformer_based_biomedical_text_extraction__chapter.pdf — Pre-Print (Pre-Refereeing), Adobe PDF, 251.38 kB (Restricted access)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1271526
Citations
  • PubMed Central: n/a
  • Scopus: n/a
  • Web of Science: n/a