Please use this identifier to cite or link to this item:
http://dspace.cas.upm.edu.ph:8080/xmlui/handle/123456789/3143
Title: | Distilled BERT models in automated ICD-10 coding |
Authors: | Simangan, Rencio Noel Q. |
Keywords: | ICD-10 Coding; Distillation; Bidirectional Encoder Representations from Transformers (BERT); Pretrained Language Model (PLM); Natural Language Processing (NLP) |
Issue Date: | Jun-2025 |
Abstract: | Accurate and efficient extraction of ICD-10 codes from electronic medical records (EMRs) remains a critical task for automating clinical documentation and supporting healthcare analytics. However, the large size and computational demands of pre-trained language models (PLMs) pose challenges for deployment in real-world and resource-constrained settings. This study investigates the effectiveness of distilled BERT-based models, specifically CompactBioBERT, DistilBioBERT, Roberta-PM-distill, TinyBioBERT, and Bio-MobileBERT, for ICD-10 code prediction using the PLM-ICD framework on the MIMIC-IV dataset. Evaluation metrics including Micro AUC, Micro Precision, Micro F1, and Precision at K (P@K) were used to assess model performance. Among the models tested, Roberta-PM-distill achieved the best results, with a Micro AUC of 97.91% and a Micro F1 score of 46.15%, while also maintaining strong performance on the P@K metrics. Although lower, this performance is comparable to that reported in similar studies, supporting the viability of distilled models for scalable and efficient ICD code prediction. A web application was developed to deploy the best-performing model for practical use. |
URI: | http://dspace.cas.upm.edu.ph:8080/xmlui/handle/123456789/3143 |
Appears in Collections: | BS Computer Science SP |
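
The abstract above reports micro-averaged metrics and Precision at K. As a point of reference only, here is a minimal Python sketch of how such multi-label metrics are typically computed; this is not the thesis author's PLM-ICD evaluation code, and the toy label matrix, random scores, and 0.5 decision threshold are assumptions made purely for illustration.

```python
# Minimal illustration (not from the thesis): computing the multi-label metrics
# named in the abstract: Micro AUC, Micro Precision, Micro F1, and Precision at K.
import numpy as np
from sklearn.metrics import f1_score, precision_score, roc_auc_score

# Toy data: 4 documents x 6 candidate ICD-10 codes (hypothetical binary label matrix).
y_true = np.array([
    [1, 0, 0, 1, 0, 0],
    [0, 1, 0, 0, 0, 1],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 1, 0],
])
# Hypothetical model scores in [0, 1] (e.g. sigmoid outputs of a distilled BERT encoder).
rng = np.random.default_rng(0)
y_score = rng.uniform(size=y_true.shape)
y_pred = (y_score >= 0.5).astype(int)  # assumed fixed decision threshold

micro_auc = roc_auc_score(y_true, y_score, average="micro")
micro_precision = precision_score(y_true, y_pred, average="micro", zero_division=0)
micro_f1 = f1_score(y_true, y_pred, average="micro", zero_division=0)

def precision_at_k(labels: np.ndarray, scores: np.ndarray, k: int) -> float:
    """Average fraction of the k highest-scoring codes per document that are true labels."""
    top_k = np.argsort(-scores, axis=1)[:, :k]        # indices of the k highest-scoring codes
    hits = np.take_along_axis(labels, top_k, axis=1)  # 1 where a top-k code is a true label
    return float(hits.mean())

print(f"Micro AUC:       {micro_auc:.4f}")
print(f"Micro Precision: {micro_precision:.4f}")
print(f"Micro F1:        {micro_f1:.4f}")
print(f"P@2:             {precision_at_k(y_true, y_score, 2):.4f}")
```

Micro-averaging pools true and false positives across all codes and documents before computing each metric, so frequent codes dominate the scores; AUC is also threshold-free, which is partly why a Micro AUC near 98% can coexist with a much lower Micro F1 at a fixed decision threshold.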
Files in This Item:
File | Description | Size | Format
---|---|---|---
2025_Simangan RNQ_Distilled BERT Models in Automated ICD-10.pdf (embargoed until 9999-01-01) | | 2.89 MB | Adobe PDF