A new proof-of-concept study reports evidence that a novel testing method has the potential to rapidly identify radiation sickness from biomarkers measured in a single drop of blood. Scientists at The Ohio State University Comprehensive Cancer Center - Arthur G. James Cancer Hospital and Richard J. Solove Research Institute (OSUCCC - James) say the test could help save lives through early, real-time identification of the condition, enabling timely clinical interventions.
Radiation sickness, or acute radiation syndrome (ARS), is a condition caused by irradiation of a large portion of the body, or the entire body, by a high dose of penetrating radiation in a very short period of time - usually a matter of minutes. Historically, this has been most relevant in accidental exposures or mass casualty radiologic events, such as those witnessed in Hiroshima and Nagasaki during World War II or the reactor accident at Chernobyl in 1986.
The condition can rapidly weaken a person through its side effects and lead to death without intervention. The current diagnostic test - a dicentric chromosome assay - requires three to four days to return results. ARS most often affects the bone marrow and gastrointestinal systems early, while its debilitating effects on the pulmonary, cardiovascular and central nervous systems can be delayed. Death can occur within days in the most severe cases, but most fatalities occur within several months of exposure. Rapid identification of exposure levels is critical for responding to an event and triaging patient treatments.
"This new test uses a single drop of blood - collected from a simple finger prick - and results are ready in a few hours. It is rapid, scalable and can serve as a point-of-care-type diagnostic tool for real-time evaluation to screen a large number of individuals in a short time."
Naduparambil K. Jacob, PhD, Associate Professor and Scientist, OSUCCC - James Translational Research Program
For this test, researchers compare the relative expression of two small molecules called microRNAs in the blood. The first is microRNA-150 - which Jacob's lab identified several years ago as a biomarker to measure the extent of bone marrow damage. This microRNA decreases as a function of radiation dose, while the normalizer, called microRNA-23a, does not change. Comparing these two molecular measures allows scientists to quantify the actual radiation dose absorbed, and therefore the overall exposure risk.
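To illustrate the general idea of a ratio-based readout like this, the sketch below converts relative expression of a dose-responsive marker against a stable normalizer into a dose estimate. It is a minimal sketch only: the cycle-threshold values, the calibration constants and the function names are hypothetical placeholders and are not taken from the study.

```python
# Hypothetical sketch of a ratio-based biodosimetry readout.
# Assumes qPCR cycle-threshold (Ct) values for the dose-responsive marker
# (miR-150) and the stable normalizer (miR-23a); all calibration numbers
# below are illustrative placeholders, not values from the study.

import math

def relative_expression(ct_mir150: float, ct_mir23a: float) -> float:
    """Standard 2^-dCt relative quantification against the normalizer."""
    delta_ct = ct_mir150 - ct_mir23a
    return 2.0 ** (-delta_ct)

def estimate_dose_gray(rel_expr: float, baseline: float = 1.0, slope: float = 0.35) -> float:
    """Map the drop in miR-150 relative expression to an absorbed dose.

    Assumes a simple log-linear calibration (hypothetical): each gray of dose
    reduces log2 relative expression by `slope`. A real assay would rely on an
    empirically fitted dose-response curve.
    """
    fold_change = rel_expr / baseline
    if fold_change >= 1.0:
        return 0.0  # no measurable depletion of miR-150
    return -math.log2(fold_change) / slope

# Example: a sample in which miR-150 amplifies ~2 cycles later than miR-23a
dose = estimate_dose_gray(relative_expression(ct_mir150=26.0, ct_mir23a=24.0))
print(f"Estimated absorbed dose: {dose:.1f} Gy (illustrative only)")
```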
"We measure ionizing radiation in grays. People who are exposed to two gray need to be identified and treated and it is predicted that if you are exposed to about four gray to the whole body, without timely treatment there is a 50 percent chance of survival," says Jacob.
He noted this tool would have critical relevance in responding to a mass casualty disaster scenario like the one at Chernobyl, identifying at-risk military personnel and civilians who need immediate treatment. It also has relevance for cancer patients, especially bone marrow transplant patients and others who undergo intensive radiation therapy, where both overdosing and underdosing are of concern.
"Some patients develop major issues like thrombocytopenia and neutropenia as the result of radiation treatment. We can't look at a patient and determine how much radiation he or she has absorbed - but the impact can be cumulative. As a result, radiation sickness could occur weeks or months after the radiation therapy," explains Jacob. "With additional research, this new testing method could potentially help oncologists measure - in real time - absorbed radiation and intervene before radiation sickness occurs."
Jacob and his colleagues report their findings in the medical journal Science Translational Medicine on July 15.