
Some Ethical Remarks on Deep Learning-Based Movements Monitoring for Preterm Infants: Green AI or Red AI?

Cacciatore, A.; Tiribelli, S.; Pigliapoco, S.
2022-01-01

Abstract

Monitoring preterm infants’ spontaneous movements is a valuable aid for the early recognition of neuro-motor impairments, which are especially common in infants born before term. Currently, highly specialized clinicians assess movement quality on the basis of subjective, discontinuous, and time-consuming observations. To support clinicians, automatic monitoring systems have been developed, among which Deep Learning algorithms (mainly Convolutional Neural Networks, CNNs) are currently the most suitable and least invasive. Research in this field has devised highly reliable models, but has tended to neglect their computational costs. These models usually require massive computation, which in turn requires expensive hardware and is environmentally unsustainable. As a consequence, the cost of these models risks making their application in actual clinical practice a privilege. However, the ultimate goal of research, especially in healthcare, should be to design technologies that are fairly accessible to as many people as possible. In light of this, this work analyzes three CNNs for preterm infants’ movement monitoring on the basis of their computational requirements. The two best-performing networks achieve very similar accuracy (Dice Similarity Coefficient around 0.88), although one of them, which we designed following the principles of Green AI, requires half as many Floating Point Operations (47 × 10⁹ vs 101 × 10⁹). Our research shows that it is possible to design highly performing and cost-efficient Convolutional Neural Networks for clinical applications.
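As a minimal illustration of the two quantities reported in the abstract, the sketch below computes the Dice Similarity Coefficient for two binary segmentation masks, and estimates the FLOPs of a single convolutional layer (the standard 2·H·W·Cin·Cout·k² count). The masks and layer dimensions are hypothetical examples, not values from the chapter:

```python
# Sketch: the two quantities reported in the abstract, on toy inputs.

def dice_coefficient(pred, target):
    """DSC = 2|A ∩ B| / (|A| + |B|) for binary masks given as flat 0/1 lists."""
    intersection = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * intersection / total

def conv2d_flops(h_out, w_out, c_in, c_out, k):
    """Approximate FLOPs of one conv layer: one multiply + one add per MAC."""
    return 2 * h_out * w_out * c_in * c_out * k * k

# Toy 8-pixel masks (illustrative only).
pred   = [1, 1, 0, 1, 0, 0, 1, 1]
target = [1, 0, 0, 1, 0, 1, 1, 1]
print(dice_coefficient(pred, target))        # 2*4 / (5+5) = 0.8

# Hypothetical layer: 64x64 output, 32 -> 64 channels, 3x3 kernel.
print(conv2d_flops(64, 64, 32, 64, 3))       # 150,994,944 FLOPs
```

Summing such per-layer counts over a whole network yields the totals (47 × 10⁹ vs 101 × 10⁹) that the abstract compares.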
2022
978-3-031-13324-4
Files in this product:

File: Some Ethical Remarks on Deep Learning-Based Movements Monitoring for Preterm Infants: Green AI or Red AI?.pdf (authorized users only)
Description: Book chapter
Type: Editorial version (published version with the publisher's layout)
License: Publisher's copyright
Size: 741.86 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11393/304670
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0