How Minimal Residual Disease Assessment Emerged as an Important Tool for Blood Cancers

While scientists have made a number of breakthroughs in their attempts to understand cancer, and while these breakthroughs have often led to new and improved treatments for patients, hundreds of questions remain unanswered. Blood cancers can be particularly difficult for researchers. Blood cancers have been the target of a large share of research in recent years, partly because of how frequently they occur, especially among young people, but also because these cancers operate at a cellular and molecular level that can often reveal information about cancer more broadly.

In the ongoing fight against cancer, and especially forms of blood cancer, MRD is emerging as an important tool.

What is MRD?

MRD stands for minimal residual disease. The abbreviation MRD is often used on its own, although when researchers talk about MRD as a tool they are typically referring to minimal residual disease assessment. MRD itself refers to the minuscule amount of cancer that may remain in the body even after treatment has been completed successfully. The level of MRD that can cause a relapse, and methods for confirming that a person is truly cancer-free, are currently topics of research. MRD assessment is the process scientists use both to detect whether traces of cancer remain in a patient and to explore what those low levels of the disease may reveal about cancer more broadly.

The role of next-generation sequencing

Any discussion of MRD would be incomplete without also discussing the importance of next-generation sequencing. When scientists first began trying to map human genomes, the process took an extraordinarily long time and could cost close to three billion dollars to complete. Multiple factors made the process so laborious. On the one hand, scientists didn't have the technology they needed to accurately observe human DNA at the molecular level. As this technology began to emerge, it was still incredibly rudimentary: not only did using it require expertise, but it was also costly to operate. On the other hand, there wasn't any kind of reference store of information about human genomes. Even when scientists could construct an entire genome, that genome would still exist in isolation. This meant both that scientists had to build from the ground up, which took more time and effort, and that they didn't have any reference models against which to compare genomes in order to look for aberrations.

With the advent of next-generation sequencing, all of that is changing. Next-generation sequencing is essentially technology that reads a human genome much more quickly and efficiently. A variety of distinct methods are grouped under the umbrella of next-generation sequencing, but what they all have in common is the ability to sequence human DNA or RNA in a fraction of the time, and at a fraction of the cost, required before these technologies emerged. In addition to the development of next-generation sequencing tools, many health care facilities are also collecting human genomes in biobanks. The larger these biobanks become, the more information scientists have for comparing genomes and the easier it will be to discover abnormalities in human genes.