Intermediate task fine-tuning in cancer classification
Reducing the amount of annotated data required to train predictive models is one of the main challenges in applying artificial intelligence to histopathology. In this paper, we propose a method to enhance the performance of deep learning models trained with limited data in the field of digital pathology. The method relies on a two-stage transfer learning process, where an intermediate model serves as a bridge between a model pretrained on ImageNet and the final cancer classification model. The intermediate model is fine-tuned with a dataset of over 4,000,000 images weakly labeled with clinical data extracted from the TCGA program. The model obtained through the proposed method significantly outperforms a model trained with a traditional transfer learning process.
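The abstract describes the method as a two-stage transfer learning pipeline: an ImageNet-pretrained network is first fine-tuned on the large, weakly labeled TCGA tile dataset (the intermediate task), and the resulting weights are then fine-tuned again on the small annotated cancer classification dataset. The following is a minimal sketch of such a pipeline in PyTorch; the ResNet-50 backbone, the `fine_tune` helper, the data loaders and the hyperparameters are illustrative assumptions, not details reported by the authors.

```python
# Minimal sketch of intermediate-task fine-tuning with PyTorch/torchvision.
# The backbone, optimizer, epochs and learning rates are illustrative
# assumptions; the paper's abstract does not specify them.
import torch
import torch.nn as nn
from torchvision import models


def fine_tune(model, loader, epochs, lr, device=None):
    """Plain supervised fine-tuning loop with cross-entropy loss."""
    device = device or ("cuda" if torch.cuda.is_available() else "cpu")
    model.to(device).train()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model


def two_stage_transfer(weak_loader, target_loader,
                       num_weak_classes, num_cancer_classes, device=None):
    """ImageNet -> intermediate task (weak TCGA labels) -> cancer classifier."""
    # Stage 0: backbone pretrained on ImageNet.
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)

    # Stage 1: intermediate task -- fine-tune on the large dataset of tiles
    # weakly labeled with clinical data (supplied here as `weak_loader`).
    model.fc = nn.Linear(model.fc.in_features, num_weak_classes)
    model = fine_tune(model, weak_loader, epochs=1, lr=1e-4, device=device)

    # Stage 2: target task -- replace the head and fine-tune on the small,
    # expert-annotated cancer classification dataset (`target_loader`).
    model.fc = nn.Linear(model.fc.in_features, num_cancer_classes)
    return fine_tune(model, target_loader, epochs=10, lr=1e-5, device=device)
```

The only difference from conventional transfer learning is the extra Stage 1 pass, which adapts the ImageNet features to histopathology imagery before the final, data-scarce fine-tuning.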
Main authors: | García, Mario Alejandro; Gramática, Martín Nicolás; Ricapito, Juan Pablo |
---|---|
Alternate title (Spanish): | Clasificación de cáncer mediante transferencia de conocimiento con tarea intermedia |
Format: | Article |
Language: | English |
Published: | 2023-10, Facultad de Informática, Universidad Nacional de La Plata, pp. 135-144 |
Subjects: | Computer science; deep learning; digital pathology; histopathology; intermediate task fine-tuning; transfer learning; knowledge transfer |
License: | Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) |
Online access: | http://sedici.unlp.edu.ar/handle/10915/160074 |
Contributed by: | SEDICI (UNLP), record I19-R120-10915-160074 |