
AI Algorithm Shows Accurate Prediction of Alzheimer’s Disease



In Boston, USA, researchers have unveiled an algorithm shown to accurately predict the risk of, and even diagnose, Alzheimer’s disease (AD). The software combines magnetic resonance imaging (MRI) scans with data on age and gender, learning to recognise identifiable markers of the disease in its early stages of development.
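
To make the idea of combining a scan with demographic data more concrete, below is a minimal sketch in Python of how such a classifier might be wired together. It assumes PyTorch and a preprocessed 3D MRI volume as input; the layer sizes and the use of a simple 3D convolutional network are illustrative assumptions, not details taken from the Boston study.

# Minimal illustrative sketch (not the authors' actual model): fuse MRI
# imaging features with age and gender to produce an AD risk score.
import torch
import torch.nn as nn

class ADRiskModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Small 3D CNN that extracts features from the MRI volume.
        self.imaging = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        # Fuse the imaging features with two scalar inputs: age and gender.
        self.classifier = nn.Sequential(
            nn.Linear(16 + 2, 32), nn.ReLU(),
            nn.Linear(32, 1),  # single logit representing AD risk
        )

    def forward(self, mri, demographics):
        # mri: (batch, 1, D, H, W); demographics: (batch, 2) = [age, gender]
        features = self.imaging(mri)
        return self.classifier(torch.cat([features, demographics], dim=1))

# Example forward pass on dummy data.
model = ADRiskModel()
mri = torch.randn(4, 1, 32, 32, 32)        # dummy MRI volumes
demo = torch.tensor([[72.0, 1.0]] * 4)     # dummy [age, gender] pairs
risk_logits = model(mri, demo)             # shape: (4, 1)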


The strategy, based on a deep-learning algorithm, is one of many machine-learning projects under way across the globe today. In this iteration, the software continuously learns from data and improves through what could fairly be described as ‘experience’. Each time the algorithm is trained on a new batch of data, it incrementally improves upon its initial diagnostic capacity, eventually finding patterns that even human practitioners may miss when examining the same data.
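
That ‘experience’ is, mechanically, nothing more than repeated small updates. The sketch below continues from the model above and shows the kind of training step involved: each new batch of labelled scans nudges the parameters so the next predictions are slightly better. The loss function and optimiser are common defaults assumed for illustration, not details reported by the researchers.

# Continuing from the ADRiskModel sketch above: one incremental update
# per batch, which is how the model accumulates 'experience'.
criterion = torch.nn.BCEWithLogitsLoss()                   # binary AD / non-AD labels
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_on_batch(mri_batch, demo_batch, labels):
    """Perform one incremental update from a new batch of data."""
    optimizer.zero_grad()
    logits = model(mri_batch, demo_batch)
    loss = criterion(logits, labels)
    loss.backward()      # compute gradients from this batch's errors
    optimizer.step()     # adjust the model slightly in response
    return loss.item()

# Example: a dummy batch with binary labels (1 = AD, 0 = healthy).
labels = torch.tensor([[1.0], [0.0], [1.0], [0.0]])
batch_loss = train_on_batch(mri, demo, labels)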


The improvement over a typical neurologist was small, the researchers admit when discussing their findings. Expert neurologists proved very capable of spotting the likely precursors to AD, though it is worth noting that the clinicians consulted for this study would likely rank at the top of their field.


To compare their algorithm’s findings against the best human equivalent, the researchers consulted experienced and capable neurologists. Not every patient across the globe will be able to access such proven and proficient practitioners for the diagnosis and subsequent treatment of AD. Additionally, the algorithm, which proved marginally better than the consulted experts at discovering the early signs of AD, runs as an essentially automatic process: data is assessed and patterns are discovered without the direct intervention of a human observer.


Naturally, this AI would not serve as a diagnostic tool in isolation; it would generate alerts to be passed to the experienced eye of its human counterpart, and diagnosis would stem from there.


Still, it is interesting to see just how useful machine learning and an algorithmic approach to recognising patterns truly are.
