AI/ML

Deep Contextual Clinical Prediction with Reverse Distillation. (arXiv:2007.05611v1 [cs.LG])


Healthcare providers are increasingly using learned methods to predict and
understand long-term patient outcomes in order to make meaningful
interventions. However, despite innovations in this area, deep learning models
often struggle to match the performance of shallow linear models in predicting
these outcomes, making it difficult to leverage such techniques in practice. In
this work, motivated by the task of clinical prediction from insurance claims,
we present a new technique called reverse distillation which pretrains deep
models by using high-performing linear models for initialization. We make use
of the longitudinal structure of insurance claims datasets to develop Self
Attention with Reverse Distillation, or SARD, an architecture that utilizes a
combination of contextual embedding, temporal embedding, and self-attention
mechanisms and, most critically, is trained via reverse distillation. SARD
outperforms state-of-the-art methods on multiple clinical prediction outcomes,
with ablation studies revealing that reverse distillation is a primary driver
of these improvements.
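
As a rough, hypothetical sketch (not the authors' released code), reverse distillation as described in the abstract could look like the following in PyTorch: a high-performing linear model is fit first, and the deep model is then pretrained to reproduce the linear model's logits before being fine-tuned on the true outcome labels. The function names, loss choices, and hyperparameters here are assumptions for illustration only.

# Illustrative sketch of reverse distillation (assumed details, not the paper's code).
# Step 1: pretrain the deep model to mimic a frozen, already-trained linear model.
# Step 2: fine-tune the deep model on the real binary outcome labels.
import torch
import torch.nn as nn

def reverse_distillation_pretrain(deep_model, linear_model, loader, epochs=5):
    """Pretrain deep_model so its logits match the frozen linear model's logits."""
    opt = torch.optim.Adam(deep_model.parameters(), lr=1e-3)
    mse = nn.MSELoss()
    linear_model.eval()
    for _ in range(epochs):
        for features, _labels in loader:  # true labels are unused during pretraining
            with torch.no_grad():
                target_logits = linear_model(features)
            loss = mse(deep_model(features), target_logits)
            opt.zero_grad()
            loss.backward()
            opt.step()

def fine_tune(deep_model, loader, epochs=5):
    """Standard supervised fine-tuning on the true clinical outcomes."""
    opt = torch.optim.Adam(deep_model.parameters(), lr=1e-4)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for features, labels in loader:
            loss = bce(deep_model(features).squeeze(-1), labels.float())
            opt.zero_grad()
            loss.backward()
            opt.step()

In this reading, the linear model serves purely as an initialization target: the deep model starts from the linear model's decision surface and then learns contextual and temporal refinements during fine-tuning.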
