In 2012, IBM estimated that 2.5 exabytes (i.e., 2.5 billion gigabytes) of data were generated every day. That figure has since become difficult to estimate because the volume is growing so fast. The main question that has arisen, however, is how healthcare providers can use the tools and advanced analytics that accompany such vast amounts of data when diagnosing and treating patients.

While previous practice methodology has relied on prescribing the medication that has statistically been effective for the largest number of patients with the same condition (evidence-based medicine), the vast amount of data that has penetrated every market, including healthcare, has led to the rise of precision medicine. Today's technology allows us to sequence a person's entire genome more rapidly and affordably than ever before; at the same time, we can combine evidence-based historical records with all of an individual patient's data to determine the best course of treatment. As this methodology progresses and develops, AI built on deep learning algorithms continues to improve and make more effective diagnoses. While it is certain that the exponential growth of technology has produced a need for precision medicine and AI, their uses are still in their infancy, and we are only scratching the surface of what is possible.
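The contrast between the two approaches can be sketched in a toy example. Everything here is hypothetical: the drug names, the efficacy figures, and the genomic marker are invented for illustration, not drawn from any real study. The idea is simply that evidence-based medicine picks the treatment that helped the most patients overall, while precision medicine narrows the evidence to patients who share the individual's characteristics.

```python
# Population-level efficacy: the fraction of all past patients who
# responded to each drug (hypothetical numbers).
population_efficacy = {"drug_a": 0.62, "drug_b": 0.55}

# Subgroup efficacy keyed by a hypothetical genomic marker.
subgroup_efficacy = {
    "marker_x": {"drug_a": 0.40, "drug_b": 0.81},
    "marker_y": {"drug_a": 0.75, "drug_b": 0.50},
}

def evidence_based_choice():
    """Pick the drug that worked for the largest share of all patients."""
    return max(population_efficacy, key=population_efficacy.get)

def precision_choice(marker):
    """Pick the drug that worked best for patients sharing this marker,
    falling back to the population-level answer if the marker is unknown."""
    rates = subgroup_efficacy.get(marker)
    if rates is None:
        return evidence_based_choice()
    return max(rates, key=rates.get)

print(evidence_based_choice())       # drug_a: best on average
print(precision_choice("marker_x"))  # drug_b: best for this subgroup
```

Note how a patient carrying `marker_x` would receive `drug_b` under the precision approach, even though `drug_a` is the better bet on average; real systems replace these lookup tables with models trained on genomic and clinical data.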